[ 509.044512] env[68442]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 509.679283] env[68492]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 511.016459] env[68492]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=68492) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 511.016882] env[68492]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=68492) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 511.016936] env[68492]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=68492) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 511.017239] env[68492]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 511.211203] env[68492]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=68492) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 511.221167] env[68492]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=68492) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 511.325999] env[68492]: INFO nova.virt.driver [None req-6c373783-0ee4-4e23-952d-d974b9d198d4 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 511.400976] env[68492]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 511.401095] env[68492]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 511.401162] env[68492]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=68492) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 514.303502] env[68492]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-92cf478a-0a4b-46c2-b911-f6f17742ea41 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.320544] env[68492]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=68492) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 514.320670] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-1f4851a5-009e-4de5-812d-0ff6b6c9fe43 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.353824] env[68492]: INFO oslo_vmware.api [-] Successfully established new session; session ID is c2378.
[ 514.354019] env[68492]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.953s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 514.354519] env[68492]: INFO nova.virt.vmwareapi.driver [None req-6c373783-0ee4-4e23-952d-d974b9d198d4 None None] VMware vCenter version: 7.0.3
[ 514.358164] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c7ec40e-c352-4f5a-aef2-fae86c5640d0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.380177] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d4797df-3bbb-4cc1-bf58-f6ebba975113 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.386423] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c75df744-6d04-4841-b94d-d0e884adbcbf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.393296] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bf6bf6b-15d0-41b0-ada8-22d64e2303aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.406778] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e206500-cb11-459a-8762-b9d6e1ddf2be {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.412948] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cce7a900-75b7-45d1-9b6b-156a720683ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.444294] env[68492]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-a5f7e7ac-a2d5-4cff-8f82-9fefde0c96c9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 514.449756] env[68492]: DEBUG nova.virt.vmwareapi.driver [None req-6c373783-0ee4-4e23-952d-d974b9d198d4 None None] Extension org.openstack.compute already exists. {{(pid=68492) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 514.452406] env[68492]: INFO nova.compute.provider_config [None req-6c373783-0ee4-4e23-952d-d974b9d198d4 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 514.472424] env[68492]: DEBUG nova.context [None req-6c373783-0ee4-4e23-952d-d974b9d198d4 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),8c210de8-cb57-4595-9d64-9bcffc8a09c2(cell1) {{(pid=68492) load_cells /opt/stack/nova/nova/context.py:464}} [ 514.474342] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.474566] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.475277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 514.475715] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Acquiring lock "8c210de8-cb57-4595-9d64-9bcffc8a09c2" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.475906] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Lock "8c210de8-cb57-4595-9d64-9bcffc8a09c2" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.476855] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Lock "8c210de8-cb57-4595-9d64-9bcffc8a09c2" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 514.502442] env[68492]: INFO dbcounter [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Registered counter for database nova_cell0 [ 514.511196] env[68492]: INFO dbcounter [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Registered counter for database nova_cell1 [ 514.514354] env[68492]: DEBUG oslo_db.sqlalchemy.engines [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68492) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 514.514950] env[68492]: DEBUG oslo_db.sqlalchemy.engines [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68492) _check_effective_sql_mode 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}} [ 514.519299] env[68492]: DEBUG dbcounter [-] [68492] Writer thread running {{(pid=68492) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 514.520412] env[68492]: DEBUG dbcounter [-] [68492] Writer thread running {{(pid=68492) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}} [ 514.522213] env[68492]: ERROR nova.db.main.api [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 514.522213] env[68492]: result = function(*args, **kwargs) [ 514.522213] env[68492]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 514.522213] env[68492]: return func(*args, **kwargs) [ 514.522213] env[68492]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 514.522213] env[68492]: result = fn(*args, **kwargs) [ 514.522213] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 514.522213] env[68492]: return f(*args, **kwargs) [ 514.522213] env[68492]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 514.522213] env[68492]: return db.service_get_minimum_version(context, binaries) [ 514.522213] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 514.522213] env[68492]: _check_db_access() [ 514.522213] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 514.522213] env[68492]: stacktrace = ''.join(traceback.format_stack()) [ 514.522213] env[68492]: [ 514.523256] env[68492]: ERROR nova.db.main.api [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 514.523256] env[68492]: result = function(*args, **kwargs) [ 514.523256] env[68492]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 514.523256] env[68492]: return func(*args, **kwargs) [ 514.523256] env[68492]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 514.523256] env[68492]: result = fn(*args, **kwargs) [ 514.523256] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 514.523256] env[68492]: return f(*args, **kwargs) [ 514.523256] env[68492]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version [ 514.523256] env[68492]: return db.service_get_minimum_version(context, binaries) [ 514.523256] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 514.523256] env[68492]: _check_db_access() [ 514.523256] env[68492]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 514.523256] env[68492]: stacktrace = ''.join(traceback.format_stack()) [ 514.523256] env[68492]: [ 514.523603] env[68492]: WARNING nova.objects.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 514.523757] env[68492]: WARNING nova.objects.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Failed to get minimum service version for cell 8c210de8-cb57-4595-9d64-9bcffc8a09c2 [ 514.524226] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Acquiring lock "singleton_lock" {{(pid=68492) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 514.524422] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Acquired lock "singleton_lock" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 514.524711] env[68492]: DEBUG oslo_concurrency.lockutils [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Releasing lock "singleton_lock" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 514.525054] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Full set of CONF: {{(pid=68492) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}} [ 514.525225] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ******************************************************************************** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 514.525367] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] Configuration options gathered from: {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 514.525504] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 514.525692] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 514.525819] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ================================================================================ {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 514.526033] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] allow_resize_to_same_host = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526208] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] arq_binding_timeout = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526342] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] backdoor_port = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526468] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] backdoor_socket = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526628] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] block_device_allocate_retries = 60 {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526796] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] block_device_allocate_retries_interval = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.526968] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cert = self.pem {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527152] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527320] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute_monitors = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527484] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] config_dir = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527651] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] config_drive_format = iso9660 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527782] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.527942] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] config_source = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528134] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] console_host = devstack {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528332] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] control_exchange = nova {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528495] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cpu_allocation_ratio = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528655] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] daemon = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528820] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] debug = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.528974] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_access_ip_network_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.529180] 
env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_availability_zone = nova {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.529348] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_ephemeral_format = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.529506] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_green_pool_size = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.529754] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.529923] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] default_schedule_zone = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530111] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] disk_allocation_ratio = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530293] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] enable_new_services = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530475] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] enabled_apis = ['osapi_compute'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530639] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] enabled_ssl_apis = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530796] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] flat_injected = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.530951] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] force_config_drive = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.531122] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] force_raw_images = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.531326] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee 
None None] graceful_shutdown_timeout = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.531499] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] heal_instance_info_cache_interval = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.531720] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] host = cpu-1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.531893] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] initial_cpu_allocation_ratio = 4.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532330] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] initial_disk_allocation_ratio = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532330] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] initial_ram_allocation_ratio = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532466] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532612] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_build_timeout = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532775] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_delete_interval = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.532994] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_format = [instance: %(uuid)s] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533125] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_name_template = instance-%08x {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533290] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_usage_audit = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533459] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_usage_audit_period = month {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533622] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533787] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.533951] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] internal_service_availability_zone = internal {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534119] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] key = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534301] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] live_migration_retry_count = 30 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534478] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_config_append = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534645] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534802] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_dir = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.534958] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535103] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_options = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535266] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_rotate_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535431] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_rotate_interval_type = days {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535599] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] log_rotation_type = none {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535728] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.535853] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536028] env[68492]: DEBUG 
oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536201] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536332] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536493] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] long_rpc_timeout = 1800 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536653] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_concurrent_builds = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536809] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_concurrent_live_migrations = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.536972] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_concurrent_snapshots = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537174] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_local_block_devices = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537341] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_logfile_count = 30 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537499] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] max_logfile_size_mb = 200 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537655] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] maximum_instance_delete_attempts = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537822] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metadata_listen = 0.0.0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.537986] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metadata_listen_port = 8775 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538168] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metadata_workers = 2 {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538329] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] migrate_max_retries = -1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538494] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] mkisofs_cmd = genisoimage {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538702] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] my_block_storage_ip = 10.180.1.21 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538834] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] my_ip = 10.180.1.21 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.538996] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] network_allocate_retries = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.539225] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.539407] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] osapi_compute_listen = 0.0.0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.539570] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] osapi_compute_listen_port = 8774 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.539740] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] osapi_compute_unique_server_name_scope = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.539907] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] osapi_compute_workers = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540106] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] password_length = 12 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540309] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] periodic_enable = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540481] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] periodic_fuzzy_delay = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540653] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] pointer_model = usbtablet {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540820] env[68492]: 
DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] preallocate_images = none {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.540980] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] publish_errors = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541127] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] pybasedir = /opt/stack/nova {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541290] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ram_allocation_ratio = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541474] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rate_limit_burst = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541656] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rate_limit_except_level = CRITICAL {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541817] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rate_limit_interval = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.541974] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reboot_timeout = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542148] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reclaim_instance_interval = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542308] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] record = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542476] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reimage_timeout_per_gb = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542640] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] report_interval = 120 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542797] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rescue_timeout = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.542955] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reserved_host_cpus = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543155] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reserved_host_disk_mb = 0 {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543326] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reserved_host_memory_mb = 512 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543487] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] reserved_huge_pages = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543647] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] resize_confirm_window = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543804] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] resize_fs_using_block_device = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.543963] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] resume_guests_state_on_host_boot = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.544147] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.544339] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rpc_response_timeout = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.544512] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] run_external_periodic_tasks = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.544684] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] running_deleted_instance_action = reap {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.544845] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] running_deleted_instance_poll_interval = 1800 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545021] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] running_deleted_instance_timeout = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545181] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler_instance_sync_interval = 120 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545352] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_down_time = 720 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545546] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] servicegroup_driver = db {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545734] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] shelved_offload_time = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.545894] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] shelved_poll_interval = 3600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.546089] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] shutdown_timeout = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.546274] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] source_is_ipv6 = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.546439] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ssl_only = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.546750] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.546930] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] sync_power_state_interval = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547108] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] sync_power_state_pool_size = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547281] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] syslog_log_facility = LOG_USER {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547439] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] tempdir = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547596] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] timeout_nbd = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547762] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] transport_url = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.547921] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] update_resources_interval = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548091] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_cow_images = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548256] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee 
None None] use_eventlog = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548692] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_journal = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548692] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_json = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548783] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_rootwrap_daemon = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.548860] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_stderr = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549026] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] use_syslog = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549221] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vcpu_pin_set = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549403] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plugging_is_fatal = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549576] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plugging_timeout = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549741] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] virt_mkfs = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.549899] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] volume_usage_poll_interval = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.550071] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] watch_log_file = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.550278] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] web = /usr/share/spice-html5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 514.550484] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_concurrency.disable_process_locking = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.550785] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.550967] 
env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.551152] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.551326] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_metrics.metrics_process_name = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.551495] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.551656] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.551836] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.auth_strategy = keystone {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552010] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.compute_link_prefix = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552224] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552419] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.dhcp_domain = novalocal {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552592] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.enable_instance_password = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552758] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.glance_link_prefix = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.552922] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553118] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.instance_list_cells_batch_strategy = distributed {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553283] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] 
api.instance_list_per_project_cells = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553442] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.list_records_by_skipping_down_cells = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553628] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.local_metadata_per_cell = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553770] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.max_limit = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.553934] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.metadata_cache_expiration = 15 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554122] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.neutron_default_tenant_id = default {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554298] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.use_forwarded_for = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554465] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.use_neutron_default_nets = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554631] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554794] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_dynamic_failure_fatal = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.554957] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.555147] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_dynamic_ssl_certfile = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.555349] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_dynamic_targets = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.555519] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_jsonfile_path = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.555701] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] 
[ 514.555701] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api.vendordata_providers = ['StaticJSON'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.555894] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.backend = dogpile.cache.memcached {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556074] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.backend_argument = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556252] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.config_prefix = cache.oslo {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556454] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.dead_timeout = 60.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556629] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.debug_cache_backend = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556794] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.enable_retry_client = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.556957] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.enable_socket_keepalive = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557150] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.enabled = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557317] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.expiration_time = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557480] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.hashclient_retry_attempts = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557648] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.hashclient_retry_delay = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557807] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_dead_retry = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.557976] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_password = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.558155] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=68492)
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.558347] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.558520] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_pool_maxsize = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.558683] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_pool_unused_timeout = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.558845] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_sasl_enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559037] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_servers = ['localhost:11211'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559239] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_socket_timeout = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559418] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.memcache_username = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559587] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.proxies = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559752] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.retry_attempts = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.559920] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.retry_delay = 0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.560100] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.socket_keepalive_count = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.560309] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.socket_keepalive_idle = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.560489] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.socket_keepalive_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.560653] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.tls_allowed_ciphers = None {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561095] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.tls_cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561095] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.tls_certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561198] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.tls_enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561321] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cache.tls_keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561513] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561696] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.auth_type = password {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.561860] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562058] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.catalog_info = volumev3::publicURL {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562226] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562392] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562556] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.cross_az_attach = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562716] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.debug = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.562875] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.endpoint_template = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.563048] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.http_retries = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
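The cache.* block that ends above is the oslo.cache group: a dogpile.cache.memcached backend against a single memcached at localhost:11211 with a 600-second default TTL. A sketch of how a consuming service turns that block into a usable cache region, assuming oslo.cache is installed; the set_override calls stand in for the values a real nova.conf would supply:

from oslo_cache import core as cache
from oslo_config import cfg

CONF = cfg.CONF
cache.configure(CONF)  # registers the [cache] option group seen above
CONF([], project='demo')
CONF.set_override('enabled', True, group='cache')
CONF.set_override('backend', 'dogpile.cache.memcached', group='cache')
CONF.set_override('memcache_servers', ['localhost:11211'], group='cache')
CONF.set_override('expiration_time', 600, group='cache')

region = cache.create_region()
cache.configure_cache_region(CONF, region)  # applies backend, servers and TTL

region.set('greeting', 'hello')             # requires a reachable memcached
assert region.get('greeting') == 'hello'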
[ 514.563218] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.563378] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.563546] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.os_region_name = RegionOne {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.563709] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.563868] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cinder.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564051] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564221] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.cpu_dedicated_set = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564413] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.cpu_shared_set = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564585] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.image_type_exclude_list = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564753] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.live_migration_wait_for_vif_plug = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.564966] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.max_concurrent_disk_ops = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.565158] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.max_disk_devices_to_attach = -1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.565329] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.565502] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.565667] env[68492]: DEBUG oslo_service.service
[None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.resource_provider_association_refresh = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.565830] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.shutdown_retry_interval = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566024] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566212] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] conductor.workers = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566394] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] console.allowed_origins = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566558] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] console.ssl_ciphers = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566732] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] console.ssl_minimum_version = default {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.566907] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] consoleauth.token_ttl = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.567123] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.567305] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.567510] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.567682] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.567845] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.568017] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] 
cyborg.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569237] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569539] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.service_type = accelerator {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569539] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569539] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569668] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.569816] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570017] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570232] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] cyborg.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570437] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.backend = sqlalchemy {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570623] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.connection = **** {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570797] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.connection_debug = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.570970] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.connection_parameters = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571163] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.connection_recycle_time = 3600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571341] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.connection_trace = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571505] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.db_inc_retry_interval = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571824] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.db_max_retries = 20 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571824] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.db_max_retry_interval = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.571983] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.db_retry_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.572172] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.max_overflow = 50 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.572369] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.max_pool_size = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.572525] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.max_retries = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.572694] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.mysql_sql_mode = TRADITIONAL {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.572853] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.mysql_wsrep_sync_wait = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.573030] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.pool_timeout = None {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.573238] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.retry_interval = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.573407] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.slave_connection = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.573581] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.sqlite_synchronous = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.573738] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] database.use_db_reconnect = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.573921] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.backend = sqlalchemy {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574115] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.connection = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574291] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.connection_debug = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574462] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.connection_parameters = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574625] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.connection_recycle_time = 3600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574791] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.connection_trace = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.574951] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.db_inc_retry_interval = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.575131] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.db_max_retries = 20 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.575297] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.db_max_retry_interval = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
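Note the masked values in the database.* and api_database.* blocks: connection and slave_connection print as **** because those options are registered with secret=True, so log_opt_values never writes credentials into the log. A small sketch of the masking, assuming oslo.config is installed; the connection URL here is a made-up example:

import logging

from oslo_config import cfg

CONF = cfg.CONF
CONF.register_opts(
    [
        cfg.StrOpt('connection', secret=True,    # secret=True: dumped as ****
                   default='mysql+pymysql://nova:hunter2@dbhost/nova'),
        cfg.IntOpt('max_pool_size', default=5),  # non-secret: dumped verbatim
    ],
    group='database',
)

logging.basicConfig(level=logging.DEBUG)
CONF([], project='demo')
CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)
# prints: database.connection = ****
#         database.max_pool_size = 5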
[ 514.575457] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.db_retry_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.575626] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.max_overflow = 50 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.575786] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.max_pool_size = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.575953] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.max_retries = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576157] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576334] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.mysql_wsrep_sync_wait = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576501] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.pool_timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576669] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.retry_interval = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576829] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.slave_connection = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.576994] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] api_database.sqlite_synchronous = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.577186] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] devices.enabled_mdev_types = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.577368] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.577533] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ephemeral_storage_encryption.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.577706] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ephemeral_storage_encryption.key_size = 512 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.577876] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.api_servers = None {{(pid=68492) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578050] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578217] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578386] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578547] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578707] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.578868] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.debug = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579042] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.default_trusted_certificate_ids = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579238] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.enable_certificate_validation = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579409] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.enable_rbd_download = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579570] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579739] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.579899] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.580088] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.580312] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.580501] env[68492]: DEBUG 
oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.num_retries = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.580676] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.rbd_ceph_conf = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.580842] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.rbd_connect_timeout = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581024] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.rbd_pool = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581203] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.rbd_user = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581369] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581528] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581698] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.service_type = image {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.581861] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582030] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582194] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582354] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582537] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582700] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.verify_glance_signatures = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.582858] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] glance.version = None {{(pid=68492) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583032] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] guestfs.debug = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583209] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.config_drive_cdrom = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583373] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.config_drive_inject_password = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583540] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583699] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.enable_instance_metrics_collection = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.583859] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.enable_remotefx = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584036] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.instances_path_share = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584207] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.iscsi_initiator_list = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584372] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.limit_cpu_features = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584533] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584691] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.584849] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.power_state_check_timeframe = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585023] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.power_state_event_polling_interval = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585200] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=68492) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585366] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.use_multipath_io = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585525] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.volume_attach_retry_count = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585682] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.volume_attach_retry_interval = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585840] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.vswitch_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.585999] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.586214] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] mks.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.586558] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.586747] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.manager_interval = 2400 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.586916] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.precache_concurrency = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587104] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.remove_unused_base_images = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587281] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587454] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587631] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] image_cache.subdirectory_name = _base {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587880] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.api_max_retries 
= 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.587977] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.api_retry_interval = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.588150] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.588343] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.auth_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.588611] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.588880] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589117] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589314] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.conductor_group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589484] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589649] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589810] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.589974] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.590183] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.590346] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.590511] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.590675] env[68492]: DEBUG 
oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.peer_list = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.590834] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.590999] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.serial_console_state_timeout = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591175] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591352] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.service_type = baremetal {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591515] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591671] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591828] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.591983] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.592181] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.592383] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ironic.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.592580] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.592758] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] key_manager.fixed_key = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.592944] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
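Service client groups such as [cinder], [glance], [cyborg], [ironic] and [keystone] in this dump all share one shape: keystoneauth1 session options (cafile, certfile, insecure, timeout), adapter options (service_type, valid_interfaces, region_name, endpoint_override) and an auth_type/auth_section pair. A sketch of how such a group becomes a client session and adapter, assuming keystoneauth1 is installed; with auth_type left None, load_auth_from_conf_options simply returns None:

from keystoneauth1 import loading as ks_loading
from oslo_config import cfg

CONF = cfg.CONF
GROUP = 'ironic'  # any of the service client groups dumped above

ks_loading.register_auth_conf_options(CONF, GROUP)     # auth_type, auth_section
ks_loading.register_session_conf_options(CONF, GROUP)  # cafile, certfile, timeout, insecure
ks_loading.register_adapter_conf_options(CONF, GROUP)  # service_type, valid_interfaces, ...

CONF([], project='demo')
auth = ks_loading.load_auth_from_conf_options(CONF, GROUP)
session = ks_loading.load_session_from_conf_options(CONF, GROUP, auth=auth)
adapter = ks_loading.load_adapter_from_conf_options(CONF, GROUP, session=session)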
[ 514.593121] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.barbican_api_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.593286] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.barbican_endpoint = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.593459] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.barbican_endpoint_type = public {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.593617] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.barbican_region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.593832] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.593930] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594103] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594270] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594427] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594588] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.number_of_retries = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594747] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.retry_delay = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.594907] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.send_service_user_token = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.595079] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.595241] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.595399] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.verify_ssl = True {{(pid=68492)
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.595556] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican.verify_ssl_path = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.595719] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.595882] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.auth_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596051] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596214] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596376] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596533] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596688] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.596848] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597017] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] barbican_service_user.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597215] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.approle_role_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597383] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.approle_secret_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597540] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597696] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.certfile = None {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.597858] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598032] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598197] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598373] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.kv_mountpoint = secret {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598532] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.kv_path = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598694] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.kv_version = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.598850] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.namespace = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599026] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.root_token_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599216] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599385] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.ssl_ca_crt_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599543] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599705] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.use_ssl = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.599873] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600053] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600265] env[68492]: DEBUG oslo_service.service [None 
req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.auth_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600437] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600599] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600761] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.600920] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601090] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601256] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601419] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601574] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601730] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.601881] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602046] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602206] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602376] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.service_type = identity {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602537] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.split_loggers = False {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602693] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.602851] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603021] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603202] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603365] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] keystone.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603565] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.connection_uri = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603727] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_mode = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.603961] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_model_extra_flags = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604070] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_models = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604245] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_power_governor_high = performance {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604452] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_power_governor_low = powersave {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604626] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_power_management = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604798] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.604961] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.device_detach_attempts = 8 {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605139] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.device_detach_timeout = 20 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605306] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.disk_cachemodes = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605465] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.disk_prefix = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605631] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.enabled_perf_events = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605793] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.file_backed_memory = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.605954] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.gid_maps = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606125] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.hw_disk_discard = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606284] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.hw_machine_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606452] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_rbd_ceph_conf = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606613] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606777] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.606944] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_rbd_glance_store_name = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607144] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_rbd_pool = rbd {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607336] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_type = default {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607495] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.images_volume_group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607656] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.inject_key = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607815] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.inject_partition = -2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.607975] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.inject_password = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608154] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.iscsi_iface = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608319] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.iser_use_multipath = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608481] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_bandwidth = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608641] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_completion_timeout = 800 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608804] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_downtime = 500 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.608961] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_downtime_delay = 75 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.609177] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_downtime_steps = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.609351] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_inbound_addr = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.609518] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_permit_auto_converge = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.609680] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_permit_post_copy = False {{(pid=68492) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.609838] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_scheme = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610016] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_timeout_action = abort {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610230] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_tunnelled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610410] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_uri = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610574] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.live_migration_with_native_tls = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610732] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.max_queues = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.610892] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.mem_stats_period_seconds = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.611071] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.nfs_mount_options = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.611421] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.611591] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_aoe_discover_tries = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.611755] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_iser_scan_tries = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.611914] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_memory_encrypted_guests = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.612088] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_nvme_discover_tries = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.612256] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_pcie_ports = 0 
{{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.612419] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.num_volume_scan_tries = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.612582] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.pmem_namespaces = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.612738] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.quobyte_client_cfg = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613044] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613222] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rbd_connect_timeout = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613390] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613552] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613711] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rbd_secret_uuid = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.613867] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rbd_user = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614084] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.realtime_scheduler_priority = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614211] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.remote_filesystem_transport = ssh {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614371] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rescue_image_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614529] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rescue_kernel_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614686] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rescue_ramdisk_id = None {{(pid=68492) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.614856] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rng_dev_path = /dev/urandom {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.615020] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.rx_queue_size = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.615198] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.smbfs_mount_options = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.615482] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.615659] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.snapshot_compression = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.615822] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.snapshot_image_format = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616054] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616229] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.sparse_logical_volumes = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616428] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.swtpm_enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616617] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.swtpm_group = tss {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616790] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.swtpm_user = tss {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.616961] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.sysinfo_serial = unique {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.617248] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.tb_cache_size = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.617419] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.tx_queue_size = None {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.617585] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.uid_maps = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.617749] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.use_virtio_for_bridges = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.617922] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.virt_type = kvm {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618110] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.volume_clear = zero {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618303] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.volume_clear_size = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618488] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.volume_use_multipath = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618659] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_cache_path = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618822] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.618991] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_mount_group = qemu {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.619205] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_mount_opts = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.619393] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.619675] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.619857] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.vzstorage_mount_user = stack {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620037] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=68492) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620221] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620396] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.auth_type = password {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620559] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620720] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.620884] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621053] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621221] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621393] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.default_floating_pool = public {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621553] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621716] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.extension_sync_interval = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.621876] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.http_retries = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622048] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622214] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622377] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622550] env[68492]: 
DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.metadata_proxy_shared_secret = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622710] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.622881] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.ovs_bridge = br-int {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623059] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.physnets = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623235] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.region_name = RegionOne {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623408] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.service_metadata_proxy = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623569] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623738] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.service_type = network {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.623903] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624074] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624243] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624401] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624580] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624741] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] neutron.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.624913] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None 
None] notifications.bdms_in_notifications = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625104] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] notifications.default_level = INFO {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625284] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] notifications.notification_format = unversioned {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625448] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] notifications.notify_on_state_change = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625622] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625797] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] pci.alias = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.625968] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] pci.device_spec = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626149] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] pci.report_in_placement = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626328] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626502] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.auth_type = password {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626671] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.auth_url = http://10.180.1.21/identity {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626840] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.626999] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.627201] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.627370] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] 
placement.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.627529] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.627688] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.default_domain_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.627847] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.default_domain_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628011] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.domain_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628178] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.domain_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628340] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628502] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628659] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628817] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.628973] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.629182] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.password = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.629357] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.project_domain_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.629526] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.project_domain_name = Default {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.629693] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.project_id = None {{(pid=68492) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.629864] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.project_name = service {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630073] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.region_name = RegionOne {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630454] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.service_type = placement {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630621] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630781] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.630943] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631119] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.system_scope = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631283] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631445] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.trust_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631602] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.user_domain_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631773] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.user_domain_name = Default {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.631932] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.user_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.632119] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.username = placement {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
514.632306] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.632467] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] placement.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.632641] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.cores = 20 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.632808] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.count_usage_from_placement = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.632980] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.633196] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.injected_file_content_bytes = 10240 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.633374] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.injected_file_path_length = 255 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.633540] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.injected_files = 5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.633706] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.instances = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.633873] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.key_pairs = 100 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634049] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.metadata_items = 128 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634222] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.ram = 51200 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634390] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.recheck_quota = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634557] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] quota.server_group_members = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634721] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None 
None] quota.server_groups = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.634888] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rdp.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.635223] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.635413] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.635586] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.635751] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.image_metadata_prefilter = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.635915] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.636137] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.max_attempts = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.636346] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.max_placement_results = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.636522] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.636692] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.query_placement_for_image_type_support = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.636854] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.637041] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] scheduler.workers = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 514.637257] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
[ 514.637443] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.637625] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.637798] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.637965] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.638146] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.638317] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.638503] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.638672] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.host_subset_size = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.638837] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639000] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.image_properties_default_architecture = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639222] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639406] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.isolated_hosts = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639574] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.isolated_images = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639740] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.max_instances_per_host = 50 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.639903] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640097] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640287] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.pci_in_placement = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640456] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640621] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640789] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.640951] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641131] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641299] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641463] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.track_instance_changes = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641639] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641808] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metrics.required = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.641971] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metrics.weight_multiplier = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.642177] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metrics.weight_of_unavailable = -10000.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.642376] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] metrics.weight_setting = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.642675] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.642850] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643038] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.port_range = 10000:20000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643218] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643390] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643555] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] serial_console.serialproxy_port = 6083 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643724] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.643898] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.auth_type = password {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644071] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644238] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644439] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644555] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644712] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.644895] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.send_service_user_token = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.645073] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.645261] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] service_user.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.645448] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.agent_enabled = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.645613] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.645907] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646116] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.html5proxy_host = 0.0.0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646296] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.html5proxy_port = 6082 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646460] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.image_compression = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646622] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.jpeg_compression = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646782] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.playback_compression = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.646955] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.server_listen = 127.0.0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.647142] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.647306] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.streaming_mode = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.647462] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] spice.zlib_compression = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] upgrade_levels.baseapi = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] upgrade_levels.cert = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] upgrade_levels.compute = auto {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] upgrade_levels.conductor = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648275] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] upgrade_levels.scheduler = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648727] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648727] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.auth_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648787] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.648909] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vendordata_dynamic_auth.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.650607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.api_retry_count = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.ca_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.cache_prefix = devstack-image-cache {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.cluster_name = testcl1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.connection_pool_size = 10 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.console_delay_seconds = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651220] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.datastore_regex = ^datastore.* {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651491] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651491] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.host_password = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651491] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.host_port = 443 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651635] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.host_username = administrator@vsphere.local {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651810] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.insecure = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.651974] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.integration_bridge = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652153] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.maximum_objects = 100 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652316] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.pbm_default_policy = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652477] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.pbm_enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652637] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.pbm_wsdl_location = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652806] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.652965] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.serial_port_proxy_uri = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653139] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.serial_port_service_uri = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653330] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.task_poll_interval = 0.5 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653476] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.use_linked_clone = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653644] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.vnc_keymap = en-us {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653808] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.vnc_port = 5900 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.653969] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vmware.vnc_port_total = 10000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.654177] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.auth_schemes = ['none'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.654400] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.654704] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.654891] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655079] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.novncproxy_port = 6080 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655266] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.server_listen = 127.0.0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655442] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655604] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.vencrypt_ca_certs = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655765] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.vencrypt_client_cert = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.655924] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vnc.vencrypt_client_key = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656116] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656285] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.disable_fallback_pcpu_query = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656447] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.disable_group_policy_check_upcall = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656765] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.disable_rootwrap = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.656926] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.enable_numa_live_migration = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657133] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657305] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657469] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.handle_virt_lifecycle_events = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657631] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.libvirt_disable_apic = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657790] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.never_download_image_if_on_rbd = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.657951] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658126] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658290] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658448] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658607] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658764] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.658919] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659110] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659295] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659483] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659654] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.client_socket_timeout = 900 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659820] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.default_pool_size = 1000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.659989] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.keep_alive = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.660215] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.max_header_line = 16384 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.660392] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.secure_proxy_ssl_header = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.660558] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.ssl_ca_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.660722] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.ssl_cert_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.660885] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.ssl_key_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.661066] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.tcp_keepidle = 600 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.661251] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.661421] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] zvm.ca_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.661584] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] zvm.cloud_connector_url = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.661869] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.662057] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] zvm.reachable_timeout = 300 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.662300] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.enforce_new_defaults = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.662498] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.enforce_scope = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.662680] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.policy_default_rule = default {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.662863] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663068] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.policy_file = policy.yaml {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663270] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663437] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663598] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663759] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.663919] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664100] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664283] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664463] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.connection_string = messaging:// {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664664] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.enabled = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664796] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.es_doc_type = notification {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.664958] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.es_scroll_size = 10000 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.665142] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.es_scroll_time = 2m {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.665367] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.filter_error_trace = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.665593] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.hmac_keys = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.665776] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.sentinel_service_name = mymaster {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.665949] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.socket_timeout = 0.1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.666155] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.trace_requests = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.666339] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler.trace_sqlalchemy = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.666520] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler_jaeger.process_tags = {} {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.666683] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler_jaeger.service_name_prefix = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.666848] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] profiler_otlp.service_name_prefix = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667024] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] remote_debug.host = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667194] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] remote_debug.port = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667378] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667543] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667703] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.667866] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668041] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668228] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668422] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668591] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668756] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.668916] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.669143] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.669330] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.669502] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.669670] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.669834] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670017] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670187] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670352] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670517] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670681] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.670842] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671016] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671189] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671354] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671521] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671688] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.671862] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672045] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672236] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672425] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672600] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_rabbit.ssl_version = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672792] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.672959] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_notifications.retry = -1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673162] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673343] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_messaging_notifications.transport_url = **** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673515] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.auth_section = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673676] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.auth_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673834] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.cafile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.673993] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.certfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674169] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.collect_timing = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674329] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.connect_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674487] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.connect_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674644] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.endpoint_id = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674801] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.endpoint_override = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.674957] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.insecure = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675125] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.keyfile = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675314] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.max_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675476] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.min_version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675630] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.region_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675785] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.service_name = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.675938] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.service_type = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676108] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.split_loggers = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676268] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.status_code_retries = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676425] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.status_code_retry_delay = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676579] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.timeout = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676734] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.valid_interfaces = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.676889] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_limit.version = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677467] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_reports.file_event_handler = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677467] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_reports.file_event_handler_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677467] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] oslo_reports.log_dir = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677598] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677740] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.677901] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678081] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678276] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678454] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_linux_bridge_privileged.user = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678623] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678783] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.678941] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.helper_command = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.679144] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.679327] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.679488] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] vif_plug_ovs_privileged.user = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.679657] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.flat_interface = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.679837] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680014] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680226] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680408] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680574] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680741] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.680904] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_linux_bridge.vlan_interface = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681092] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681292] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.isolate_vif = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681476] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681643] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681814] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.681985] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.ovsdb_interface = native {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682164] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_vif_ovs.per_port_bridge = False {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682332] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_brick.lock_path = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682494] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682651] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] os_brick.wait_mpath_device_interval = 1 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682817] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.capabilities = [21] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.682973] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683144] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.helper_command = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683313] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683475] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.thread_pool_size = 8 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683629] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] privsep_osbrick.user = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683797] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.683953] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.group = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.684122] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.helper_command = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.684307] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.684485] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.thread_pool_size = 8 {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.684643] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] nova_sys_admin.user = None {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 514.684772] env[68492]: DEBUG oslo_service.service [None req-55cbf5e5-3551-4d60-bed8-7aac6c17e8ee None None] ******************************************************************************** {{(pid=68492) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}}
[ 514.685204] env[68492]: INFO nova.service [-] Starting compute node (version 0.0.1)
[ 514.695597] env[68492]: WARNING nova.virt.vmwareapi.driver [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list.
[ 514.696062] env[68492]: INFO nova.virt.node [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Generated node identity dba0d66f-84ca-40a4-90ee-609cf684af11 [ 514.696379] env[68492]: INFO nova.virt.node [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Wrote node identity dba0d66f-84ca-40a4-90ee-609cf684af11 to /opt/stack/data/n-cpu-1/compute_id [ 514.708492] env[68492]: WARNING nova.compute.manager [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Compute nodes ['dba0d66f-84ca-40a4-90ee-609cf684af11'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 514.740186] env[68492]: INFO nova.compute.manager [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 514.760253] env[68492]: WARNING nova.compute.manager [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 514.760482] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.760686] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.760835] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 514.760988] env[68492]: DEBUG nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 514.762124] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ab8d95-46ab-413d-8750-5c5e79558ba8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.771010] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b11ca8c-89c2-465e-a3eb-c232f7e59ffe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.784634] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c2790c-171c-4442-b38e-0879ff683345 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.790735] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-627f8ebb-1f3f-4f7b-934d-e64b9057f359 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.820408] env[68492]: DEBUG nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180971MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 514.820513] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 514.820683] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 514.831862] env[68492]: WARNING nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] No compute node record for cpu-1:dba0d66f-84ca-40a4-90ee-609cf684af11: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host dba0d66f-84ca-40a4-90ee-609cf684af11 could not be found. [ 514.844012] env[68492]: INFO nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: dba0d66f-84ca-40a4-90ee-609cf684af11 [ 514.894023] env[68492]: DEBUG nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 514.894232] env[68492]: DEBUG nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 514.997789] env[68492]: INFO nova.scheduler.client.report [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] [req-5dd2a32d-774a-45a6-805d-2ef88d515618] Created resource provider record via placement API for resource provider with UUID dba0d66f-84ca-40a4-90ee-609cf684af11 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 515.014215] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf128e03-b90b-4867-90ac-387acc759926 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.022626] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04c0e4e0-3f21-4c7a-9418-202a1c74fd1e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.052350] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2268ab63-4542-436c-baab-ab05e4bcc869 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.059352] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922e0c72-d371-4318-82fa-690c7bd2786d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 515.072153] env[68492]: DEBUG nova.compute.provider_tree [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 515.109254] env[68492]: DEBUG nova.scheduler.client.report [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Updated inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 515.109494] env[68492]: DEBUG nova.compute.provider_tree [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Updating resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 generation from 0 to 1 during operation: update_inventory {{(pid=68492) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 515.109639] env[68492]: DEBUG nova.compute.provider_tree [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 515.157684] env[68492]: DEBUG nova.compute.provider_tree [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Updating 
resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 generation from 1 to 2 during operation: update_traits {{(pid=68492) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 515.174340] env[68492]: DEBUG nova.compute.resource_tracker [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 515.174517] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 515.175012] env[68492]: DEBUG nova.service [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Creating RPC server for service compute {{(pid=68492) start /opt/stack/nova/nova/service.py:182}} [ 515.187832] env[68492]: DEBUG nova.service [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] Join ServiceGroup membership for this service compute {{(pid=68492) start /opt/stack/nova/nova/service.py:199}} [ 515.188305] env[68492]: DEBUG nova.servicegroup.drivers.db [None req-c60b1ff8-fb7a-46b4-9d7c-be4b824e2bb4 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=68492) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 524.521422] env[68492]: DEBUG dbcounter [-] [68492] Writing DB stats nova_cell0:SELECT=1 {{(pid=68492) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 524.523072] env[68492]: DEBUG dbcounter [-] [68492] Writing DB stats nova_cell1:SELECT=1 {{(pid=68492) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 552.330749] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 552.331758] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 552.352694] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Starting instance...
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 552.472388] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 552.472660] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 552.474570] env[68492]: INFO nova.compute.claims [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 552.623372] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3daf1b3-24df-440a-808a-d5b0d13001ad {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.634378] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b41472d8-c588-403d-8216-b7961023e13f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.668733] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38059cc-3bd1-48f3-be80-b46d571f6529 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.676359] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd4ddcf3-9cb6-49e2-a1bd-a8545ea6c929 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.692980] env[68492]: DEBUG nova.compute.provider_tree [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 552.707641] env[68492]: DEBUG nova.scheduler.client.report [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 552.730522] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 
tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 552.731043] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 552.779963] env[68492]: DEBUG nova.compute.utils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 552.783017] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Not allocating networking since 'none' was specified. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 552.793175] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 552.878542] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 555.068127] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 555.068127] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 555.068127] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 555.068596] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 555.068596] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 555.068596] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 555.068596] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 555.068596] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 555.074736] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862
tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 555.075027] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 555.075332] env[68492]: DEBUG nova.virt.hardware [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 555.077164] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2478034-fea4-49c1-a575-b9ee44a8ea13 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.089891] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217402a8-d8c8-4337-872f-ca4467b45509 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.115224] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1dceaf3-526c-47a1-b0a1-e0eee306e9f0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.136500] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Instance VIF info [] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.148064] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.148394] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8f612213-c402-4767-ba73-fc948f0d0e55 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.162701] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Created folder: OpenStack in parent group-v4. [ 555.163535] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Creating folder: Project (cb7b8a60d7d14b40abc2a592a72ac5bc). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.165019] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d5f3d8e-5c15-479c-b6a6-ef71c5407f42 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.175846] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Created folder: Project (cb7b8a60d7d14b40abc2a592a72ac5bc) in parent group-v677434. [ 555.176160] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Creating folder: Instances. Parent ref: group-v677435. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.176554] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2751c89-dfea-48f6-8162-5308f5389171 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.186535] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Created folder: Instances in parent group-v677435. [ 555.186793] env[68492]: DEBUG oslo.service.loopingcall [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 555.187130] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.187777] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7db0e1a8-ee7a-4819-973b-72f2ca1661e1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.206027] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.206027] env[68492]: value = "task-3395320" [ 555.206027] env[68492]: _type = "Task" [ 555.206027] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.215643] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395320, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 555.390881] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 555.390881] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 555.416157] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 555.493979] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 555.494118] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 555.495687] env[68492]: INFO nova.compute.claims [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 555.647797] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2705ef70-5a08-4169-b8a9-51da0d5dadfa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.655844] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db0fe51c-bcdc-4946-9aff-523b22bc6aa9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.691775] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cfb0c9d-feea-40a6-8e3c-57eb9770210a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.701028] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af46507e-41a0-475e-a3d9-681bc8facb2c {{(pid=68492) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.721839] env[68492]: DEBUG nova.compute.provider_tree [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 555.735943] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395320, 'name': CreateVM_Task, 'duration_secs': 0.313661} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 555.736630] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 555.737513] env[68492]: DEBUG nova.scheduler.client.report [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 555.743301] env[68492]: DEBUG oslo_vmware.service [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8627abc-1697-4724-8ba0-4e8f0343800d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.752861] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 555.753057] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 555.753777] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 555.754852] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5d2f22a9-0c59-45d0-b2cf-8721f908d34c {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.762377] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for the task: (returnval){ [ 555.762377] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52fd72d7-179b-3989-2d40-aa00831f7eec" [ 555.762377] env[68492]: _type = "Task" [ 555.762377] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.763078] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 555.763548] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 555.782463] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 555.782872] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 555.783134] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 555.783280] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 555.783737] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 555.783991] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46d51cb9-a2d3-436c-8454-01cce225fa32 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.796258] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 555.796439] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 555.797377] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eceb8d5-9abe-4c6b-9492-9e60d58f266b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.806801] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80aec18d-e6d3-4fee-bb1c-f243dd2e40ca {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.812470] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for the task: (returnval){ [ 555.812470] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5207255b-5aea-bd22-5d99-f6d4805fa112" [ 555.812470] env[68492]: _type = "Task" [ 555.812470] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.819542] env[68492]: DEBUG nova.compute.utils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 555.823620] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 555.823879] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 555.828904] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5207255b-5aea-bd22-5d99-f6d4805fa112, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 555.841829] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 555.940813] env[68492]: DEBUG nova.policy [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ba712884338c42809fcb88de5f5d2040', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9bf13155470c452c8b24b3051f187eb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 555.947170] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 555.987206] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 555.987454] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 555.987603] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 555.988245] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:388}} [ 555.988245] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 555.992531] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 555.992531] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 555.992531] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 555.992531] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 555.992531] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 555.992759] env[68492]: DEBUG nova.virt.hardware [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 555.993372] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39f93fe9-e196-4c6a-9c00-b0a06833325e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.001379] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d58714a7-0142-44ed-9218-50462ca75e0d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.332350] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 556.332636] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 
tempest-ServerShowV257Test-2027026142-project-member] Creating directory with path [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 556.332878] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ef0d967-c535-426f-9264-42e7abf9e1f7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.359763] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Created directory with path [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 556.359849] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Fetch image to [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 556.360109] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 556.361054] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d0dd78f-c355-4246-9ef6-8a1bcece27f6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.373222] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19ca7336-c950-420a-b7e2-c4724407da64 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.386175] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d1c4de2-5ed6-4d63-8cdf-8cadcac1ec14 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.429927] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40195784-d080-448d-935c-dfbaf9799958 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.437934] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-736e3198-c834-40e0-973e-2c7f46d4bdf8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.528658] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Downloading image 
file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 556.611869] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 556.676028] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 556.676804] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 556.728184] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Successfully created port: d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 558.479450] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Successfully updated port: d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.500878] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 558.501043] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquired lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 558.501181] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 558.562127] env[68492]: DEBUG nova.network.neutron [None 
req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 558.958891] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Updating instance_info_cache with network_info: [{"id": "d5676a36-7b97-477b-a65a-c799f7346940", "address": "fa:16:3e:c1:bc:d4", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5676a36-7b", "ovs_interfaceid": "d5676a36-7b97-477b-a65a-c799f7346940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 558.978024] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Releasing lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 558.978362] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Instance network_info: |[{"id": "d5676a36-7b97-477b-a65a-c799f7346940", "address": "fa:16:3e:c1:bc:d4", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5676a36-7b", "ovs_interfaceid": "d5676a36-7b97-477b-a65a-c799f7346940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 558.979071] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:bc:d4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd5676a36-7b97-477b-a65a-c799f7346940', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 558.989651] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Creating folder: Project (9bf13155470c452c8b24b3051f187eb5). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.990426] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a5fe4e8c-03c6-471a-a6ee-20f0258ad06b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.002848] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Created folder: Project (9bf13155470c452c8b24b3051f187eb5) in parent group-v677434. [ 559.002848] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Creating folder: Instances. Parent ref: group-v677438. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.003098] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0127daf4-cd5b-4efe-8ae2-e7cd18298d1e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.012446] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Created folder: Instances in parent group-v677438. [ 559.012715] env[68492]: DEBUG oslo.service.loopingcall [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 559.012796] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.012987] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eab3b365-5ea4-483e-8098-ba1812150728 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.037740] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.037740] env[68492]: value = "task-3395323" [ 559.037740] env[68492]: _type = "Task" [ 559.037740] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.046944] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395323, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.492595] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "02050238-c4a5-4c06-952d-06af14ff7d35" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 559.493418] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "02050238-c4a5-4c06-952d-06af14ff7d35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 559.514025] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 559.551667] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395323, 'name': CreateVM_Task, 'duration_secs': 0.296932} completed successfully.
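[editor's note] The "(returnval){ value = "task-3395323" ... }" block above is a vCenter Task managed-object reference; oslo_vmware.api.wait_for_task then polls its TaskInfo (the "progress is 0%" lines) until the task reaches success or error. A reduced sketch of that polling loop against pyVmomi, with oslo.vmware's retry and backoff details omitted:

```python
import time
from pyVmomi import vim

def wait_for_task(task, interval=0.5):
    """Poll TaskInfo until the task leaves queued/running (cf. _poll_task)."""
    while True:
        state = task.info.state
        if state == vim.TaskInfo.State.success:
            return task.info.result     # e.g. the new VirtualMachine reference
        if state == vim.TaskInfo.State.error:
            raise task.info.error       # pyVmomi faults are Exception subclasses
        time.sleep(interval)            # the "progress is 0%" lines happen here
```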
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 559.551850] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 559.579178] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.579400] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 559.579944] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 559.580216] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3a67d00-d8c5-45c9-ab0f-61e5cf109158 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.587841] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Waiting for the task: (returnval){ [ 559.587841] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e8d3fd-4929-ff34-5af3-74d126907ddb" [ 559.587841] env[68492]: _type = "Task" [ 559.587841] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.602321] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e8d3fd-4929-ff34-5af3-74d126907ddb, 'name': SearchDatastore_Task} progress is 0%. 
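[editor's note] The "[datastore2] devstack-image-cache_base/595bda25-..." lock taken above serializes population of the per-datastore image cache: the first spawn downloads the image to vmware_temp/.../tmp-sparse.vmdk over an authenticated HTTP PUT (the rw_handles lines earlier), while concurrent spawns of the same image wait on the lock and then find the cached VMDK via SearchDatastore_Task. A sketch of that single-flight pattern with oslo.concurrency; the fetch callback and lock_path are placeholders, not Nova's code:

```python
from oslo_concurrency import lockutils

def ensure_cached_image(ds_name, image_id, fetch_fn):
    """Single-flight image-cache population, mirroring the lock name above."""
    lock_name = "[%s] devstack-image-cache_base/%s" % (ds_name, image_id)
    # external=True adds a file lock so other workers on the same host queue too
    with lockutils.lock(lock_name, external=True, lock_path="/tmp/nova-locks"):
        fetch_fn()  # first holder downloads tmp-sparse.vmdk; the rest find it cached
```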
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.605325] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 559.606437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 559.609727] env[68492]: INFO nova.compute.claims [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 559.748928] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f68dba5-4910-4bdb-9ca6-ceb4ad5d80b8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.757535] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98608b50-9a98-4b70-b0ab-3b5f9b027314 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.800419] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d37b3bae-8b36-4ccc-a123-b480d75c2a21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.808542] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad252ca4-9e16-4254-80a0-376c4e107f66 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.825123] env[68492]: DEBUG nova.compute.provider_tree [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 559.836474] env[68492]: DEBUG nova.scheduler.client.report [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 559.856294] env[68492]: 
DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 559.856889] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 559.905747] env[68492]: DEBUG nova.compute.utils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 559.908709] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 559.909614] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 559.921999] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 560.009128] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Start spawning the instance on the hypervisor. 
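[editor's note] The inventory dict logged above is what the resource tracker reports to Placement, and the "Claim successful" lines mean the flavor fit under the derived capacity. Effective capacity per resource class is (total - reserved) * allocation_ratio, while max_unit caps any single allocation (16 vCPUs, 102 GB disk here). Worked out for these numbers:

```python
# Placement-style effective capacity for the inventory logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```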
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 560.043261] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 560.043517] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 560.043671] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 560.043848] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 560.043994] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 560.044163] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 560.044371] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 560.044532] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 560.044700] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 560.044858] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 560.045050] env[68492]: DEBUG nova.virt.hardware [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 560.046758] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce83d55-410a-486b-9825-7ef1e6f5e605 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.057931] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5fe8045-0413-4dd8-8462-c3400d6b5c53 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.097875] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 560.098244] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 560.098570] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.189502] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 560.214460] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 560.214460] env[68492]: value = "domain-c8" [ 560.214460] env[68492]: _type = "ClusterComputeResource" [ 560.214460] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 560.215910] 
env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5af31918-e168-4d78-84d8-b1713538889c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.231250] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 2 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 560.231250] env[68492]: WARNING nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] While synchronizing instance power states, found 3 instances in the database and 2 instances on the hypervisor. [ 560.231684] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid d1d77916-2250-4bce-a3c1-50a2dda3627f {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 560.232548] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid b1180e4b-9e82-42e3-867c-b4a757ca6f14 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 560.232548] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 02050238-c4a5-4c06-952d-06af14ff7d35 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 560.232976] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.233062] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.233273] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "02050238-c4a5-4c06-952d-06af14ff7d35" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.233498] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 560.235313] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 560.235313] env[68492]: value = "domain-c8" [ 560.235313] env[68492]: _type = "ClusterComputeResource" [ 560.235313] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 560.237146] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-541a3f2b-dfb1-4f76-81b3-e2d48169e576 {{(pid=68492) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.249336] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 2 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 560.395093] env[68492]: DEBUG nova.policy [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38a6e59c63824d38b6ba08f38f340969', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '169f6cf2abfb4e64b93f8cbb5449ec9a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 560.697853] env[68492]: DEBUG nova.compute.manager [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Received event network-vif-plugged-d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 560.699062] env[68492]: DEBUG oslo_concurrency.lockutils [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] Acquiring lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.699062] env[68492]: DEBUG oslo_concurrency.lockutils [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 560.699062] env[68492]: DEBUG oslo_concurrency.lockutils [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 560.699062] env[68492]: DEBUG nova.compute.manager [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] No waiting events found dispatching network-vif-plugged-d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 560.699273] env[68492]: WARNING nova.compute.manager [req-773524a5-dcd9-4457-8480-841133c6a04c req-0457e6d9-6dc1-42b9-8ace-27da82662f1d service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Received unexpected event network-vif-plugged-d5676a36-7b97-477b-a65a-c799f7346940 for instance with vm_state building and task_state spawning.
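[editor's note] The network-vif-plugged sequence above shows Neutron delivering an external event while the instance is still spawning: pop_instance_event finds no registered waiter, so the event is logged as unexpected (harmless here, since the spawn path in this log does not block on VIF plug). The manager's register-before-trigger pattern, reduced to plain threading as an illustration; the names are not Nova's API:

```python
import threading

class InstanceEvents:
    """Register an expected event first; a later delivery wakes the waiter."""
    def __init__(self):
        self._events = {}           # (instance_uuid, event_name) -> Event
        self._lock = threading.Lock()

    def prepare(self, uuid, name):
        with self._lock:
            return self._events.setdefault((uuid, name), threading.Event())

    def pop(self, uuid, name):
        with self._lock:
            ev = self._events.pop((uuid, name), None)
        if ev is None:
            return False            # nobody waiting: the "unexpected event" case
        ev.set()                    # wakes whoever called prepare() and waits
        return True
```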
[ 561.818961] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "3de34725-4b54-4956-b2b6-285c9138e94c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.818961] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.833516] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 561.935055] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.935326] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.936820] env[68492]: INFO nova.compute.claims [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 561.966106] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Successfully created port: 3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 561.994255] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "e9f787fc-98be-4086-9b70-ebbf33e31d13" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.994255] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock
"e9f787fc-98be-4086-9b70-ebbf33e31d13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.011686] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 562.086665] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.124745] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca034cb8-022f-4a41-a142-59a31752ec57 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.140788] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0febd077-ec30-429c-b6c5-7736534cded5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.185099] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e1a6699-55b1-4f19-b5e4-ca0869dab830 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.193819] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f81a93e-3ca4-407b-99c2-0fc2173cee0a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.218577] env[68492]: DEBUG nova.compute.provider_tree [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 562.229344] env[68492]: DEBUG nova.scheduler.client.report [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 562.248900] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 562.249466] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 562.253119] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.168s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.254770] env[68492]: INFO nova.compute.claims [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 562.308362] env[68492]: DEBUG nova.compute.utils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 562.310673] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 562.310856] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 562.328433] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 562.448599] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 562.483923] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54eaf7f2-c6cb-4ff3-858e-9f97bfb53842 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.490911] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 562.491898] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 562.492102] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 562.492300] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 562.492444] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 562.492585] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 562.493549] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 562.493765] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f 
tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 562.493944] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 562.494142] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 562.494321] env[68492]: DEBUG nova.virt.hardware [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 562.495496] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ba1c10-a0fc-4d3e-b9e5-ebefff4b1103 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.509931] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce71279c-8c13-4e8a-af5a-4f9976ef7665 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.515010] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-044e72f1-0317-45fd-9fd2-3385256e7cf1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.558559] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e8174f-72cd-4f91-b438-df029f1ba3ac {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.566547] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4eb3273-45c3-4171-ae29-5de343fcab55 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.584348] env[68492]: DEBUG nova.compute.provider_tree [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 562.605318] env[68492]: DEBUG nova.scheduler.client.report [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 562.630906] env[68492]: DEBUG nova.policy [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1581cf3c5f1473dbce8c123aab15c5d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e88b0740987944eaae8fe55d4434ceb7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 562.635716] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.383s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 562.636838] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 562.702630] env[68492]: DEBUG nova.compute.utils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 562.703880] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Not allocating networking since 'none' was specified. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 562.719455] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 562.888364] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Start spawning the instance on the hypervisor. 
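[editor's note] The hardware.py walkthrough repeated for each build (and again just below) enumerates sockets:cores:threads triples for the flavor's vCPU count: with no flavor or image constraints the maxima default to 65536, and a topology is kept when the product equals the vCPU count, which is why one vCPU always yields the single 1:1:1 result logged here. A standalone sketch of that enumeration (a simplification of what these log lines describe, not Nova's exact code):

```python
from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield every sockets*cores*threads triple that multiplies out to vcpus."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield VirtCPUTopology(s, c, t)

print(list(possible_topologies(1)))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
print(list(possible_topologies(4)))  # six options: 4:1:1, 2:2:1, 2:1:2, 1:4:1, ...
```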
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 562.926824] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 562.927133] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 562.927254] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 562.927451] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 562.927544] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 562.927681] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 562.927926] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 562.929124] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 562.929411] env[68492]: DEBUG nova.virt.hardware [None 
[ 562.929411] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 562.930686] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 562.930784] env[68492]: DEBUG nova.virt.hardware [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 562.932225] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feef4041-1293-4ed1-b345-4efd209fe236 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 562.944799] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ec2012e-b742-4ae8-9e92-5f0d71ebdda9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 562.964305] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Instance VIF info [] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 562.972375] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Creating folder: Project (cf81febcd9dc436da18c5f5c20a3fab1). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 562.973119] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a270abdf-b375-47d1-8ffc-093b382f3311 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 562.986262] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Created folder: Project (cf81febcd9dc436da18c5f5c20a3fab1) in parent group-v677434.
[ 562.986262] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Creating folder: Instances. Parent ref: group-v677441. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 562.986262] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e109866d-8762-43fd-b78c-b12d1292d25c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 562.999906] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Created folder: Instances in parent group-v677441.
[ 563.000285] env[68492]: DEBUG oslo.service.loopingcall [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 563.002304] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 563.002304] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7c2484dd-d3cd-4e8d-a9a9-d28344b4acfb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 563.018989] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 563.018989] env[68492]: value = "task-3395326"
[ 563.018989] env[68492]: _type = "Task"
[ 563.018989] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 563.028813] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395326, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 563.532027] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395326, 'name': CreateVM_Task, 'duration_secs': 0.27627} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 563.532027] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 563.532453] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 563.532624] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 563.532940] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 563.533246] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7667d7d0-fa56-4a1e-943b-fe19e1a9c2af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 563.538164] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for the task: (returnval){
[ 563.538164] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524f2ec7-907d-dfaa-c675-47a8514a185b"
[ 563.538164] env[68492]: _type = "Task"
[ 563.538164] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 563.547510] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524f2ec7-907d-dfaa-c675-47a8514a185b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 563.862700] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Successfully created port: 3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 564.050445] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 564.051053] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 564.051053] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 564.343388] env[68492]: DEBUG nova.compute.manager [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Received event network-changed-d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 564.343575] env[68492]: DEBUG nova.compute.manager [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Refreshing instance network info cache due to event network-changed-d5676a36-7b97-477b-a65a-c799f7346940. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 564.344730] env[68492]: DEBUG oslo_concurrency.lockutils [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] Acquiring lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 564.344730] env[68492]: DEBUG oslo_concurrency.lockutils [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] Acquired lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 564.344730] env[68492]: DEBUG nova.network.neutron [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Refreshing network info cache for port d5676a36-7b97-477b-a65a-c799f7346940 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 564.452999] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Successfully updated port: 3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 564.472822] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 564.472965] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquired lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 564.473157] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 564.608801] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 564.980705] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Updating instance_info_cache with network_info: [{"id": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "address": "fa:16:3e:09:8e:3f", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.185", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e521413-38", "ovs_interfaceid": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 565.003222] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Releasing lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 565.003518] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Instance network_info: |[{"id": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "address": "fa:16:3e:09:8e:3f", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.185", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e521413-38", "ovs_interfaceid": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 565.003897] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:09:8e:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3e521413-384c-4c5f-b0f4-07c077ee1cc1', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 565.017308] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Creating folder: Project (169f6cf2abfb4e64b93f8cbb5449ec9a). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 565.017869] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb468eb6-0481-4be6-9c5f-531b9f256419 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 565.031027] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Created folder: Project (169f6cf2abfb4e64b93f8cbb5449ec9a) in parent group-v677434.
[ 565.031027] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Creating folder: Instances. Parent ref: group-v677444. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 565.031027] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7c6df41c-7400-441c-a230-5dad1f277121 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 565.044753] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Created folder: Instances in parent group-v677444.
[ 565.044753] env[68492]: DEBUG oslo.service.loopingcall [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 565.044753] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 565.044753] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-189979d3-7244-4647-8d6e-67e071e21884 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 565.070406] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 565.070406] env[68492]: value = "task-3395329"
[ 565.070406] env[68492]: _type = "Task"
[ 565.070406] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 565.078999] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395329, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 565.584111] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395329, 'name': CreateVM_Task, 'duration_secs': 0.32218} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 565.584313] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 565.584977] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 565.585216] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 565.585461] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 565.585707] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7de4ddf8-532f-489d-bae2-cb9ecba85660 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 565.592662] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Waiting for the task: (returnval){
[ 565.592662] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522d6759-7bfa-a0ed-1906-cea920b9061d"
[ 565.592662] env[68492]: _type = "Task"
[ 565.592662] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 565.606892] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522d6759-7bfa-a0ed-1906-cea920b9061d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 565.761324] env[68492]: DEBUG nova.network.neutron [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Updated VIF entry in instance network info cache for port d5676a36-7b97-477b-a65a-c799f7346940. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 565.761570] env[68492]: DEBUG nova.network.neutron [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Updating instance_info_cache with network_info: [{"id": "d5676a36-7b97-477b-a65a-c799f7346940", "address": "fa:16:3e:c1:bc:d4", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5676a36-7b", "ovs_interfaceid": "d5676a36-7b97-477b-a65a-c799f7346940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 565.778574] env[68492]: DEBUG oslo_concurrency.lockutils [req-38162944-6d69-4c11-a9be-62f36c6d7609 req-fdc041f5-43fb-46ce-a254-1f31ff690fb7 service nova] Releasing lock "refresh_cache-b1180e4b-9e82-42e3-867c-b4a757ca6f14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 566.105144] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Successfully updated port: 3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 566.111842] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 566.111842] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 566.111842] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 566.121069] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 566.121069] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquired lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 566.121069] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 566.196353] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 566.662391] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Updating instance_info_cache with network_info: [{"id": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "address": "fa:16:3e:24:61:d7", "network": {"id": "c205b2ab-712c-4a5d-8ac4-1e2cb8754b26", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-985707380-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e88b0740987944eaae8fe55d4434ceb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "721e64ee-fc02-4eb5-9c8c-ea55647a1b92", "external-id": "nsx-vlan-transportzone-621", "segmentation_id": 621, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ca2ba0a-10", "ovs_interfaceid": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 566.676096] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Releasing lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 566.676412] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance network_info: |[{"id": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "address": "fa:16:3e:24:61:d7", "network": {"id": "c205b2ab-712c-4a5d-8ac4-1e2cb8754b26", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-985707380-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e88b0740987944eaae8fe55d4434ceb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "721e64ee-fc02-4eb5-9c8c-ea55647a1b92", "external-id": "nsx-vlan-transportzone-621", "segmentation_id": 621, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ca2ba0a-10", "ovs_interfaceid": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 566.676811] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:61:d7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '721e64ee-fc02-4eb5-9c8c-ea55647a1b92', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 566.691612] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Creating folder: Project (e88b0740987944eaae8fe55d4434ceb7). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 566.693747] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0679db7a-cbbf-4e95-8140-76e95addbe29 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 566.706582] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Created folder: Project (e88b0740987944eaae8fe55d4434ceb7) in parent group-v677434.
[ 566.706758] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Creating folder: Instances. Parent ref: group-v677447. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 566.706899] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-270095f9-a051-4b3a-a8af-bd2a961d63c8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 566.719473] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Created folder: Instances in parent group-v677447.
[ 566.719735] env[68492]: DEBUG oslo.service.loopingcall [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 566.719958] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 566.720064] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-62c671cc-353c-4136-9eb4-e59b751ee0fa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 566.746821] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 566.746821] env[68492]: value = "task-3395332"
[ 566.746821] env[68492]: _type = "Task"
[ 566.746821] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 566.762376] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395332, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 567.257626] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395332, 'name': CreateVM_Task, 'duration_secs': 0.309841} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 567.257944] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 567.258527] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 567.258673] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 567.259039] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 567.259219] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e8313591-c5ba-4c8c-8530-e7271aa35cba {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 567.264233] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for the task: (returnval){
[ 567.264233] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d7f963-cd45-3621-9634-126f3c1ca4b2"
[ 567.264233] env[68492]: _type = "Task"
[ 567.264233] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 567.272764] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d7f963-cd45-3621-9634-126f3c1ca4b2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 567.777192] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 567.777511] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 567.777982] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 568.055018] env[68492]: DEBUG nova.compute.manager [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Received event network-vif-plugged-3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 568.055018] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Acquiring lock "02050238-c4a5-4c06-952d-06af14ff7d35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 568.056216] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Lock "02050238-c4a5-4c06-952d-06af14ff7d35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 568.062060] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Lock "02050238-c4a5-4c06-952d-06af14ff7d35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 568.062060] env[68492]: DEBUG nova.compute.manager [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] No waiting events found dispatching network-vif-plugged-3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 568.062060] env[68492]: WARNING nova.compute.manager [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Received unexpected event network-vif-plugged-3e521413-384c-4c5f-b0f4-07c077ee1cc1 for instance with vm_state building and task_state spawning.
[ 568.062060] env[68492]: DEBUG nova.compute.manager [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Received event network-changed-3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 568.062426] env[68492]: DEBUG nova.compute.manager [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Refreshing instance network info cache due to event network-changed-3e521413-384c-4c5f-b0f4-07c077ee1cc1. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 568.062426] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Acquiring lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 568.062426] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Acquired lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 568.062426] env[68492]: DEBUG nova.network.neutron [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Refreshing network info cache for port 3e521413-384c-4c5f-b0f4-07c077ee1cc1 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 568.426227] env[68492]: DEBUG nova.compute.manager [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Received event network-vif-plugged-3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 568.426486] env[68492]: DEBUG oslo_concurrency.lockutils [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] Acquiring lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 568.426626] env[68492]: DEBUG oslo_concurrency.lockutils [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] Lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 568.426803] env[68492]: DEBUG oslo_concurrency.lockutils [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] Lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 568.427208] env[68492]: DEBUG nova.compute.manager [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] No waiting events found dispatching network-vif-plugged-3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 568.427877] env[68492]: WARNING nova.compute.manager [req-4d36cc53-b059-43df-8b8d-411d7bb6259c req-86076310-86d5-4086-b8bf-51a0c0db4452 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Received unexpected event network-vif-plugged-3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 for instance with vm_state building and task_state spawning.
[ 568.430934] env[68492]: DEBUG nova.network.neutron [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Updated VIF entry in instance network info cache for port 3e521413-384c-4c5f-b0f4-07c077ee1cc1. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 568.431728] env[68492]: DEBUG nova.network.neutron [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Updating instance_info_cache with network_info: [{"id": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "address": "fa:16:3e:09:8e:3f", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.185", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e521413-38", "ovs_interfaceid": "3e521413-384c-4c5f-b0f4-07c077ee1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 568.449646] env[68492]: DEBUG oslo_concurrency.lockutils [req-543b82bf-f1b0-4d3b-9eb2-899a4d122661 req-7aafff49-4cc8-4e90-bda8-59eb4c5aab93 service nova] Releasing lock "refresh_cache-02050238-c4a5-4c06-952d-06af14ff7d35" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 570.613857] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 570.614127] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.646228] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 570.752551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.752551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.752551] env[68492]: INFO nova.compute.claims [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 570.946352] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fcba723-174b-4a43-90e5-c0f85fd18435 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.958075] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71663714-d207-463c-a1be-0182d85faa3e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.996283] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e0691d-249b-43ae-acab-9ce576ad0dfd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.004218] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a317d979-fbd2-4629-aa6e-4315d793e451 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.018824] env[68492]: DEBUG nova.compute.provider_tree [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.031777] env[68492]: DEBUG nova.scheduler.client.report [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.057806] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.058340] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 571.109021] env[68492]: DEBUG nova.compute.utils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 571.110026] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Not allocating networking since 'none' was specified. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 571.120242] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 571.222483] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 571.275538] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.275813] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.276124] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 571.276212] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 571.299341] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.299522] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.299630] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.299803] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.299869] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.300244] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 571.300542] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 571.300993] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.301297] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.301489] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.301677] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.302205] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.302956] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.302956] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}}
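
The "Running periodic task" entries above come from oslo.service's periodic-task machinery: ComputeManager subclasses periodic_task.PeriodicTasks, and run_periodic_tasks() logs each due task's name at DEBUG before invoking it. A minimal sketch of that pattern follows; the Worker class, _heal_cache name, and 60-second spacing are illustrative, not Nova's actual configuration:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Worker(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        # Fires at most every 60 seconds; oslo.service logs
        # "Running periodic task Worker._heal_cache" before each call.
        @periodic_task.periodic_task(spacing=60)
        def _heal_cache(self, context):
            pass  # e.g. skip instances that are still building, as logged above

    # A service loop would call this repeatedly; tasks that are due run,
    # the rest are skipped until their spacing has elapsed.
    Worker().run_periodic_tasks(context=None)
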
[ 571.303195] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 571.324711] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.324826] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.325322] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.325500] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 571.326631] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4485b7b-4627-4570-aead-78d3921c16f8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.338365] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be0ca1f6-3ee0-459a-a89e-b20d7af09168 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.357848] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd629387-8d72-4ac9-8fb9-9ad372148329 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.366261] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8684dba8-02a7-440d-bc34-3f4d3d46239f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.402698] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180982MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 571.402866] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.403132]
env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.493869] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d1d77916-2250-4bce-a3c1-50a2dda3627f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 571.494458] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b1180e4b-9e82-42e3-867c-b4a757ca6f14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 571.494458] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 02050238-c4a5-4c06-952d-06af14ff7d35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 571.494458] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3de34725-4b54-4956-b2b6-285c9138e94c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 571.494458] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e9f787fc-98be-4086-9b70-ebbf33e31d13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 571.494674] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
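
The per-instance allocations above are booked against the inventory that keeps appearing in the "Inventory has not changed" entries. Placement derives the schedulable capacity of each resource class as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small worked example in plain Python, with the inventory values copied from this log:

    # Capacity per resource class, as placement derives it from inventory.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def capacity(inv):
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

    for rc, inv in inventory.items():
        print(rc, capacity(inv))
    # VCPU 192.0       -- 48 physical vcpus oversubscribed 4x
    # MEMORY_MB 196078.0, DISK_GB 400.0
    # max_unit still limits a single request: 16 VCPU, 65530 MB, 102 GB above.
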
[ 571.494711] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 571.494923] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 571.621605] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de95d80b-347b-4339-8b3f-ec128251753a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.631226] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a49db0f6-377a-4a6f-b464-ba0b52219f9d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.680968] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76c841e1-e04c-4f16-9644-0845df16e22d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.689062] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af38afb2-1f71-4893-a0f8-42a6ed7c6fd3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.708468] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.720902] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.739790] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 571.740027] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.908111] env[68492]: DEBUG nova.virt.hardware [None
req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.911212] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.912324] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.912324] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.912324] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.912324] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.912324] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.912571] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.912571] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:501}} [ 571.912571] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.912744] env[68492]: DEBUG nova.virt.hardware [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.913644] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-949e3ac3-aeb6-4b51-ae59-24f63a5a1cae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.932260] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4eaaff1-8568-4044-9871-62ee42b33d39 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.947026] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance VIF info [] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 571.952615] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Creating folder: Project (39ef613f7ae444b0b50dbd654389db61). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.952972] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cb87f23d-80da-4b55-817c-7d190c606301 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.964937] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Created folder: Project (39ef613f7ae444b0b50dbd654389db61) in parent group-v677434. [ 571.965233] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Creating folder: Instances. Parent ref: group-v677450. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.965395] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7a79c2ce-e46f-4afb-8f93-ccd215ce4714 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.975310] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Created folder: Instances in parent group-v677450. 
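
The Folder.CreateFolder calls just completed and the Folder.CreateVM_Task that follows both go through the same oslo.vmware pattern: invoke_api() issues the SOAP request (the "Invoking ..." DEBUG lines) and, for methods that return a vCenter Task, wait_for_task() polls it until completion (the "Task: {...} progress is N%" lines). A hedged sketch of that pattern; the endpoint, credentials, and the vm_folder/config_spec/res_pool references are placeholders, not values from this deployment:

    from oslo_vmware import api

    def create_vm(session, vm_folder, config_spec, res_pool):
        # Issues the SOAP call; logged as "Invoking Folder.CreateVM_Task ...".
        task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                                  config=config_spec, pool=res_pool)
        # Polls the returned Task object, logging progress until it completes.
        return session.wait_for_task(task)

    # Placeholder vCenter endpoint; constructing the session establishes it.
    session = api.VMwareAPISession('vcenter.example.org', 'admin', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)
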
[ 571.975807] env[68492]: DEBUG oslo.service.loopingcall [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 571.975807] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 571.976014] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-984185d0-5a3e-4a6d-8155-c60ef3636db8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.994065] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 571.994065] env[68492]: value = "task-3395335" [ 571.994065] env[68492]: _type = "Task" [ 571.994065] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.007367] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395335, 'name': CreateVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.507898] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395335, 'name': CreateVM_Task, 'duration_secs': 0.304677} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 572.508173] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.508652] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.508816] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.509191] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 572.509452] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dcd864ec-6868-4883-b281-00f795207cb2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.514854] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for the task: (returnval){ [ 
572.514854] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52311e55-a903-2605-c836-e3a25a617160" [ 572.514854] env[68492]: _type = "Task" [ 572.514854] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.524492] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52311e55-a903-2605-c836-e3a25a617160, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.888026] env[68492]: DEBUG nova.compute.manager [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Received event network-changed-3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 572.888026] env[68492]: DEBUG nova.compute.manager [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Refreshing instance network info cache due to event network-changed-3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 572.890602] env[68492]: DEBUG oslo_concurrency.lockutils [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] Acquiring lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.890602] env[68492]: DEBUG oslo_concurrency.lockutils [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] Acquired lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.890602] env[68492]: DEBUG nova.network.neutron [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Refreshing network info cache for port 3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 573.031648] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 573.031907] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 573.032145] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.672828] env[68492]: DEBUG nova.network.neutron [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Updated VIF entry in instance network info cache for port 3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 573.673199] env[68492]: DEBUG nova.network.neutron [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Updating instance_info_cache with network_info: [{"id": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "address": "fa:16:3e:24:61:d7", "network": {"id": "c205b2ab-712c-4a5d-8ac4-1e2cb8754b26", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-985707380-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e88b0740987944eaae8fe55d4434ceb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "721e64ee-fc02-4eb5-9c8c-ea55647a1b92", "external-id": "nsx-vlan-transportzone-621", "segmentation_id": 621, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ca2ba0a-10", "ovs_interfaceid": "3ca2ba0a-1057-4cce-8f7b-6823c2f6ef44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 573.713775] env[68492]: DEBUG oslo_concurrency.lockutils [req-7d026c04-b971-41ed-ba6c-c04e9060f46a req-efcadf18-3a1d-493c-bc11-46698becc3f5 service nova] Releasing lock "refresh_cache-3de34725-4b54-4956-b2b6-285c9138e94c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 574.696942] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.697249] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.717031] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] 
[instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 574.794411] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.798020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.798020] env[68492]: INFO nova.compute.claims [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.958754] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a381e2b3-ef28-4fd7-96b6-0dcd33eb8903 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.968725] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6c3664-300b-48a4-ba90-7c99720408e8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.004220] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4ff649d-dced-4dd7-961b-e4c8d152a530 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.012362] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-133a0af5-c5fc-4585-8eba-d91c1b911b26 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.028159] env[68492]: DEBUG nova.compute.provider_tree [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.041524] env[68492]: DEBUG nova.scheduler.client.report [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.059898] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 575.060458] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 575.123027] env[68492]: DEBUG nova.compute.utils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 575.125303] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 575.125687] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 575.143605] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 575.252941] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 575.293237] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 575.293538] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 575.293618] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 575.293776] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 575.293916] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 575.294078] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 575.294287] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 575.294445] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 575.294606] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df 
tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 575.294766] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 575.294940] env[68492]: DEBUG nova.virt.hardware [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 575.296562] env[68492]: DEBUG nova.policy [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae12ba5644da4cf99138f90612514431', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df5cff4632c44e188abd1b60a3eecedd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 575.304164] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29f73158-f29f-4117-81ff-383268ac01bb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.321276] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-171d5cda-57e8-45f9-9dc1-d4d5b85a7606 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.705050] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "b7e0d1c7-d21b-42c1-b400-86be946df689" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 575.705372] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 575.726845] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 575.794245] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 575.795944] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 575.796561] env[68492]: INFO nova.compute.claims [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 575.834920] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Successfully created port: 598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 576.036326] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b84c5cc-046e-4aea-9b55-201768ef8f52 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.048211] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9622659a-7496-4834-9472-91dba301b83f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.093999] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3cb3d86-e5cb-4bb9-918d-e4d36f3b319d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.105472] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d0f6340-e474-4d08-9d97-e74bfc312bc8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.121171] env[68492]: DEBUG nova.compute.provider_tree [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 576.142750] env[68492]: DEBUG nova.scheduler.client.report [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 576.159099] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 576.159562] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 576.218026] env[68492]: DEBUG nova.compute.utils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 576.220262] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 576.220477] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 576.232860] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 576.312585] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 576.323185] env[68492]: DEBUG nova.policy [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6046267ac804199963fb7ce6ba465ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0143a0b112cc48c5b4696b3c06d04e73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 576.358124] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 576.358124] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 576.358278] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 576.358449] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 576.358566] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 576.358866] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 576.358933] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 576.360219] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 576.360490] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 576.360671] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 576.360876] env[68492]: DEBUG nova.virt.hardware [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 576.361868] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c3876ae-9247-453c-b646-d2b141bef1bf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.374450] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd75dd5-d721-44b9-9890-6eb7d881072d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.040551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "12450355-d90e-40dc-b66f-6105ec320d19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 577.042214] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 577.058512] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 
tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 577.120761] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 577.120761] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 577.122557] env[68492]: INFO nova.compute.claims [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 577.164022] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Successfully updated port: 598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 577.175304] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 577.175518] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 577.175650] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 577.263679] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Successfully created port: 0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 577.270520] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] 
Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 577.382760] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b50802fe-2b2b-44d4-847c-ebdde39eff61 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.390650] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceb94f00-082e-407f-8f2b-80ccea18254d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.428760] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db87c0b0-dff8-45b3-8530-d7a783677c17 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.438082] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa321763-f38d-4cb6-b58f-4e903dd26e67 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.451837] env[68492]: DEBUG nova.compute.provider_tree [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 577.461313] env[68492]: DEBUG nova.scheduler.client.report [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 577.476385] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 577.477177] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Start building networks asynchronously for instance. 
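The claim logged above succeeds because the placement inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 still has headroom; for each resource class the usable capacity is (total - reserved) * allocation_ratio. A minimal sketch of that arithmetic over the exact figures in the inventory dump (the helper is illustrative, not Nova code):

    # Capacity math behind the "Claim successful" line, using the inventory
    # reported above: (total - reserved) * allocation_ratio per resource class.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def effective_capacity(inv):
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(effective_capacity(INVENTORY))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}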
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 577.521222] env[68492]: DEBUG nova.compute.utils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 577.523109] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 577.523278] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 577.542611] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 577.630426] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Start spawning the instance on the hypervisor. 
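The entries that follow repeat the CPU-topology walk seen earlier at 576.35x: with every flavor and image limit at 0:0:0, a one-vCPU m1.nano admits exactly one topology, 1 socket x 1 core x 1 thread. A simplified re-derivation of that enumeration (illustrative only; nova/virt/hardware.py applies further constraints):

    # Enumerate (sockets, cores, threads) triples whose product equals the
    # vCPU count and that fit under the maximum topology; with no limits the
    # maxima default to 65536, as in the "limits were ..." lines.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        found.append((s, c, t))
        return found

    print(possible_topologies(1))  # [(1, 1, 1)], matching "Got 1 possible topologies"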
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 577.658450] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 577.659816] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 577.659816] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 577.659816] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 577.659816] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 577.659816] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 577.660315] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 577.660315] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 577.660315] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 577.660315] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 577.662630] env[68492]: DEBUG nova.virt.hardware [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 577.663553] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fae594b-21c7-47ea-9511-dc64d36361c2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.673703] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6cb0083-7d94-4b37-b27b-a8618b31ad8c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 577.890485] env[68492]: DEBUG nova.policy [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e0f28c524601422792463cbe532e44ad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8d0bf43012c42e1902c054df4ea4e1f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 578.137323] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Updating instance_info_cache with network_info: [{"id": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "address": "fa:16:3e:ce:60:6a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap598168e3-27", "ovs_interfaceid": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 578.159135] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 578.159135] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance network_info: |[{"id": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "address": "fa:16:3e:ce:60:6a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap598168e3-27", "ovs_interfaceid": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 578.159283] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:60:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '598168e3-27c1-4bd9-8974-c1829fcd2bb0', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 578.170526] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating folder: Project (df5cff4632c44e188abd1b60a3eecedd). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 578.172893] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e46c6ee6-4ca8-4c64-bb9c-6a123d94a0bd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.182649] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created folder: Project (df5cff4632c44e188abd1b60a3eecedd) in parent group-v677434. [ 578.182851] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating folder: Instances. Parent ref: group-v677453. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 578.183146] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f69b53c9-e524-4317-b9f4-2648a9077169 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.194686] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created folder: Instances in parent group-v677453. [ 578.194831] env[68492]: DEBUG oslo.service.loopingcall [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 578.195043] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 578.195257] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-52eacb95-4a31-48cc-919b-4557b59b969d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.214609] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 578.214609] env[68492]: value = "task-3395338" [ 578.214609] env[68492]: _type = "Task" [ 578.214609] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 578.224984] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395338, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 578.730191] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395338, 'name': CreateVM_Task, 'duration_secs': 0.395703} completed successfully. 
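CreateVM_Task (task-3395338) completes after roughly 0.40s of polling; each "progress is 0%" line is one poll iteration. oslo.vmware's wait_for_task wraps this invoke-then-poll pattern; a generic, self-contained sketch of the loop (states and intervals are illustrative):

    import time

    def wait_for_task(poll, interval=0.5, timeout=60.0):
        # Poll until the task reaches a terminal state, mirroring the
        # "progress is 0%" ... "completed successfully" lines above.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state, progress = poll()
            if state == 'success':
                return
            if state == 'error':
                raise RuntimeError('task failed')
            time.sleep(interval)
        raise TimeoutError('task did not complete in time')

    states = iter([('running', 0), ('running', 60), ('success', 100)])
    wait_for_task(lambda: next(states), interval=0.0)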
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 578.730191] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 578.730191] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 578.730191] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 578.730893] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 578.730893] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-236fd983-ad43-418b-a5f3-6d0ed42c604e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.739580] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){ [ 578.739580] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c93d3d-6a6c-2653-334b-d9171000d370" [ 578.739580] env[68492]: _type = "Task" [ 578.739580] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 578.747972] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c93d3d-6a6c-2653-334b-d9171000d370, 'name': SearchDatastore_Task} progress is 0%. 
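The locks named "[datastore2] devstack-image-cache_base/595bda25-..." serialize spawns that share one cached image: whoever takes the lock first checks the datastore (the SearchDatastore_Task above) and fetches the VMDK only if it is missing. A rough sketch of that check-then-fetch pattern with oslo.concurrency; the cache helpers are in-memory stand-ins, not the real vmops code:

    from oslo_concurrency import lockutils

    _CACHE = set()  # stand-in for files present in the datastore image cache

    def exists_in_cache(path):            # stand-in for SearchDatastore_Task
        return path in _CACHE

    def fetch_to_cache(image_id, path):   # stand-in for the actual VMDK copy
        _CACHE.add(path)

    def ensure_cached(image_id, datastore='datastore2'):
        cache_path = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
        with lockutils.lock(cache_path):  # one lock per cached image path
            if not exists_in_cache(cache_path):
                fetch_to_cache(image_id, cache_path)
        return cache_path

    ensure_cached('595bda25-3485-4d7e-9f66-50f61186cadc')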
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 579.255143] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 579.255143] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 579.255143] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 579.628907] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Successfully created port: 7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 579.785894] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "acbc1e36-0803-44ff-8ebc-094083193bc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.786161] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.802078] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 579.875253] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.875530] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.876994] env[68492]: INFO nova.compute.claims [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 579.934692] env[68492]: DEBUG nova.compute.manager [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Received event network-vif-plugged-598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 579.935068] env[68492]: DEBUG oslo_concurrency.lockutils [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] Acquiring lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.935403] env[68492]: DEBUG oslo_concurrency.lockutils [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.935403] env[68492]: DEBUG oslo_concurrency.lockutils [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 579.935626] env[68492]: DEBUG nova.compute.manager [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] No waiting events found dispatching network-vif-plugged-598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 579.935711] env[68492]: WARNING nova.compute.manager [req-07241ebb-a6bb-4545-b3cd-871eb881f15c req-9196a63f-2411-4992-950b-8ae6cb8a3f1d service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Received unexpected event network-vif-plugged-598168e3-27c1-4bd9-8974-c1829fcd2bb0 for instance with vm_state building and task_state spawning. 
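The WARNING just above is benign: neutron delivered network-vif-plugged-598168e3-... before the driver registered a waiter for it, so pop_instance_event found nothing to dispatch. A toy model of that registry's pop semantics (the class is illustrative, not nova.compute.manager.InstanceEvents):

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}  # (instance_uuid, event_tag) -> callback

        def prepare(self, instance_uuid, event_tag, callback):
            self._waiters[(instance_uuid, event_tag)] = callback

        def pop(self, instance_uuid, event_tag):
            cb = self._waiters.pop((instance_uuid, event_tag), None)
            if cb is None:  # nobody was waiting: the "unexpected event" case
                print('WARNING: unexpected event %s for %s' % (event_tag, instance_uuid))
            else:
                cb()

    InstanceEvents().pop('f3c94673-a8fc-4ead-9907-4347cd6244ba',
                         'network-vif-plugged-598168e3-27c1-4bd9-8974-c1829fcd2bb0')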
[ 579.951760] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Successfully updated port: 0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 579.976341] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 579.976341] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquired lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 579.976341] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 580.101353] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 580.117513] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f27edf7-1a36-491d-9e3c-7cc39ca3608a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.125961] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e78c4186-819d-455c-a9e9-5c2badd116d0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.158353] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d3d7c45-f0a0-449d-8c95-df11747a091b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.170436] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a388c26-166a-45df-a781-afba0972523b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.184474] env[68492]: DEBUG nova.compute.provider_tree [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 580.196010] env[68492]: DEBUG nova.scheduler.client.report [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 580.212281] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.212788] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 580.264051] env[68492]: DEBUG nova.compute.utils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 580.265300] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 580.265468] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 580.275450] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 580.835529] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Start spawning the instance on the hypervisor. 
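As with the earlier instances, "Allocating IP information in the background" means port allocation runs concurrently with block-device-mapping construction, and the spawn path joins on the result later. Nova does this with an eventlet greenthread; a thread-pool future gives the same shape (both helpers are stand-ins):

    from concurrent.futures import ThreadPoolExecutor

    UUID = 'acbc1e36-0803-44ff-8ebc-094083193bc4'

    def allocate_network(instance_uuid):        # stand-in for allocate_for_instance()
        return ['port-for-%s' % instance_uuid]

    def build_block_device_mappings(instance_uuid):
        return [{'device_name': '/dev/sda', 'instance': instance_uuid}]

    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_network, UUID)
        bdms = build_block_device_mappings(UUID)   # proceeds while ports allocate
        network_info = nw_future.result()          # join before hypervisor spawn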
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 580.862103] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 580.862374] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 580.862551] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 580.862658] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 580.862794] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 580.862955] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 580.863216] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 580.863347] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 580.863507] 
env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 580.863667] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 580.863829] env[68492]: DEBUG nova.virt.hardware [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 580.864778] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1cd929-30e5-4aa3-b912-04d3afc90f4e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.873061] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a209635-d823-4c9a-9c39-ada91b7cf0ed {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.041189] env[68492]: DEBUG nova.policy [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7938ef239a0c4ae29febdd7ecf1cde37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3d54feaed07492da952b05c788f99f9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 581.311489] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Updating instance_info_cache with network_info: [{"id": "0d91df47-3bcf-41ae-abc6-b32f973c86a6", "address": "fa:16:3e:50:2b:32", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.62", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d91df47-3b", "ovs_interfaceid": 
"0d91df47-3bcf-41ae-abc6-b32f973c86a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.326719] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Releasing lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 581.327156] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance network_info: |[{"id": "0d91df47-3bcf-41ae-abc6-b32f973c86a6", "address": "fa:16:3e:50:2b:32", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.62", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d91df47-3b", "ovs_interfaceid": "0d91df47-3bcf-41ae-abc6-b32f973c86a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 581.327418] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:50:2b:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0d91df47-3bcf-41ae-abc6-b32f973c86a6', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.338422] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Creating folder: Project (0143a0b112cc48c5b4696b3c06d04e73). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.338569] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-01a1aec5-8ac4-4d09-b0f7-e2480a6e8bc6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.354225] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Created folder: Project (0143a0b112cc48c5b4696b3c06d04e73) in parent group-v677434. [ 581.354442] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Creating folder: Instances. Parent ref: group-v677456. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.354666] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1c2e4c5b-72df-48c6-afea-ae82dd03f094 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.366859] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Created folder: Instances in parent group-v677456. [ 581.367674] env[68492]: DEBUG oslo.service.loopingcall [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 581.367674] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.367674] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca20f1b6-caac-4d54-9fc0-247eff57a4f8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.390956] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.390956] env[68492]: value = "task-3395341" [ 581.390956] env[68492]: _type = "Task" [ 581.390956] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.400549] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395341, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 581.907509] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395341, 'name': CreateVM_Task, 'duration_secs': 0.309681} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 581.907954] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.909193] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 581.909512] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 581.910793] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 581.910793] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e1693ac-d321-457f-b4e7-219f710e3f9c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.916619] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for the task: (returnval){ [ 581.916619] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52442337-c94b-fbed-4ed8-bda8f62ae4fa" [ 581.916619] env[68492]: _type = "Task" [ 581.916619] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 581.927198] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52442337-c94b-fbed-4ed8-bda8f62ae4fa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 582.428589] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.430233] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.430233] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.004904] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Successfully created port: feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 583.247503] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Successfully updated port: 7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 583.268890] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.269135] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquired lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 583.269221] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 583.471263] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] 
[instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 583.653640] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "14af3749-f031-4543-96e4-af0b4fd28e2b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.653873] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 584.648405] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Updating instance_info_cache with network_info: [{"id": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "address": "fa:16:3e:ae:d1:77", "network": {"id": "45f8649f-a424-409e-8030-e5b80e3b4714", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-957537048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8d0bf43012c42e1902c054df4ea4e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7ffd7eea-2a", "ovs_interfaceid": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.663155] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Releasing lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 584.663155] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance network_info: |[{"id": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "address": "fa:16:3e:ae:d1:77", "network": {"id": "45f8649f-a424-409e-8030-e5b80e3b4714", "bridge": "br-int", "label": 
"tempest-VolumesAssistedSnapshotsTest-957537048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8d0bf43012c42e1902c054df4ea4e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7ffd7eea-2a", "ovs_interfaceid": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 584.663287] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ae:d1:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3ac3fd84-c373-49f5-82dc-784a6cdb686d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 584.672116] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Creating folder: Project (a8d0bf43012c42e1902c054df4ea4e1f). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.675025] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c453e978-0a80-4b43-ba61-e32dd2ec80d7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.686812] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Created folder: Project (a8d0bf43012c42e1902c054df4ea4e1f) in parent group-v677434. [ 584.687019] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Creating folder: Instances. Parent ref: group-v677459. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.687255] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-20232daa-9205-479a-9676-4211df06da1a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.697024] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Created folder: Instances in parent group-v677459. [ 584.697275] env[68492]: DEBUG oslo.service.loopingcall [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 584.697528] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 584.697735] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-827aa8ed-e256-4edf-a3b9-f1f046582291 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.726132] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 584.726132] env[68492]: value = "task-3395344" [ 584.726132] env[68492]: _type = "Task" [ 584.726132] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 584.735035] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395344, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.184304] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Received event network-changed-598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 585.184304] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Refreshing instance network info cache due to event network-changed-598168e3-27c1-4bd9-8974-c1829fcd2bb0. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 585.184639] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Acquiring lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.184639] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Acquired lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.184962] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Refreshing network info cache for port 598168e3-27c1-4bd9-8974-c1829fcd2bb0 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 585.238862] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395344, 'name': CreateVM_Task, 'duration_secs': 0.345286} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 585.239134] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 585.240306] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.240486] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.241091] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 585.241505] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52aafb02-7352-42dd-8803-68cf712cb4c9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 585.247707] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for the task: (returnval){ [ 585.247707] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52cb92b2-cc19-a9a7-4d06-e210580d19e9" [ 585.247707] env[68492]: _type = "Task" [ 585.247707] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.257526] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52cb92b2-cc19-a9a7-4d06-e210580d19e9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.417020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.417323] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.615326] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Successfully updated port: feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 585.631020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.631020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.631020] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 585.729028] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 585.768384] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.768384] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 585.768384] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.225423] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Updating instance_info_cache with network_info: [{"id": "feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "address": "fa:16:3e:c5:5a:fe", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.141", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeaa8b45-f9", "ovs_interfaceid": "feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.245432] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.245432] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance network_info: |[{"id": 
"feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "address": "fa:16:3e:c5:5a:fe", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.141", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeaa8b45-f9", "ovs_interfaceid": "feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 586.245551] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c5:5a:fe', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'feaa8b45-f990-433f-aa2d-d0b5d7fc4c57', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 586.252288] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating folder: Project (d3d54feaed07492da952b05c788f99f9). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 586.253200] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aa257cce-fed4-4652-a840-c8a8d15a2abe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.265397] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created folder: Project (d3d54feaed07492da952b05c788f99f9) in parent group-v677434. [ 586.265899] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating folder: Instances. Parent ref: group-v677462. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 586.266284] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5ec5715d-29d2-487b-a733-eb3c8c71b401 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.275090] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created folder: Instances in parent group-v677462. [ 586.275618] env[68492]: DEBUG oslo.service.loopingcall [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 586.275996] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 586.278110] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6c69f103-9430-4dbf-8c68-f9e1fd20d29d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.297526] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 586.297526] env[68492]: value = "task-3395347" [ 586.297526] env[68492]: _type = "Task" [ 586.297526] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.309162] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395347, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.449814] env[68492]: DEBUG nova.compute.manager [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Received event network-vif-plugged-7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 586.449814] env[68492]: DEBUG oslo_concurrency.lockutils [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] Acquiring lock "12450355-d90e-40dc-b66f-6105ec320d19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.449814] env[68492]: DEBUG oslo_concurrency.lockutils [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] Lock "12450355-d90e-40dc-b66f-6105ec320d19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.450550] env[68492]: DEBUG oslo_concurrency.lockutils [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] Lock "12450355-d90e-40dc-b66f-6105ec320d19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 586.450962] env[68492]: DEBUG nova.compute.manager [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] No waiting events found dispatching network-vif-plugged-7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 586.451343] env[68492]: WARNING nova.compute.manager [req-fbe322fe-c6dd-4b5c-8e8d-b025a49a0a0e req-613fc6ab-69e9-4a15-9e47-6f5fceb206d8 service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Received unexpected event network-vif-plugged-7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 for instance with vm_state building and task_state spawning. [ 586.811948] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395347, 'name': CreateVM_Task, 'duration_secs': 0.338206} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 586.811948] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 586.813277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.813277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.813277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 586.814079] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-581bc57d-e4e4-4f43-a0c2-b0f61cbe5cc5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 586.821040] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 586.821040] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c05d82-3b33-6ef8-0eea-9a7ea2d040dc" [ 586.821040] env[68492]: _type = "Task" [ 586.821040] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 586.829697] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c05d82-3b33-6ef8-0eea-9a7ea2d040dc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 586.955197] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Updated VIF entry in instance network info cache for port 598168e3-27c1-4bd9-8974-c1829fcd2bb0. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 586.955539] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Updating instance_info_cache with network_info: [{"id": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "address": "fa:16:3e:ce:60:6a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap598168e3-27", "ovs_interfaceid": "598168e3-27c1-4bd9-8974-c1829fcd2bb0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.967372] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Releasing lock "refresh_cache-f3c94673-a8fc-4ead-9907-4347cd6244ba" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.967687] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Received event network-vif-plugged-0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 586.967879] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Acquiring lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 586.968088] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 586.968280] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 586.969315] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 
req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] No waiting events found dispatching network-vif-plugged-0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 586.969315] env[68492]: WARNING nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Received unexpected event network-vif-plugged-0d91df47-3bcf-41ae-abc6-b32f973c86a6 for instance with vm_state building and task_state spawning. [ 586.969315] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Received event network-changed-0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 586.969315] env[68492]: DEBUG nova.compute.manager [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Refreshing instance network info cache due to event network-changed-0d91df47-3bcf-41ae-abc6-b32f973c86a6. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 586.969315] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Acquiring lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 586.969522] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Acquired lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 586.969522] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Refreshing network info cache for port 0d91df47-3bcf-41ae-abc6-b32f973c86a6 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 587.335187] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 587.335448] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 587.335661] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 587.793174] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 587.795015] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.828285] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Updated VIF entry in instance network info cache for port 0d91df47-3bcf-41ae-abc6-b32f973c86a6. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 588.828761] env[68492]: DEBUG nova.network.neutron [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Updating instance_info_cache with network_info: [{"id": "0d91df47-3bcf-41ae-abc6-b32f973c86a6", "address": "fa:16:3e:50:2b:32", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.62", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d91df47-3b", "ovs_interfaceid": "0d91df47-3bcf-41ae-abc6-b32f973c86a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 588.842643] env[68492]: DEBUG oslo_concurrency.lockutils [req-b2d37ea3-34af-4769-b3d4-4d983a901713 req-853684d8-9d33-4af5-ae35-5111e1331011 service nova] Releasing lock "refresh_cache-b7e0d1c7-d21b-42c1-b400-86be946df689" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 590.518046] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.518325] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.291058] env[68492]: DEBUG nova.compute.manager [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Received event network-vif-plugged-feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 591.291058] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Acquiring lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.291058] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.291058] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 591.291406] env[68492]: DEBUG nova.compute.manager [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] No waiting events found dispatching network-vif-plugged-feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 591.291406] env[68492]: WARNING nova.compute.manager [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Received unexpected event network-vif-plugged-feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 for instance with vm_state building and task_state spawning. 
[ 591.292354] env[68492]: DEBUG nova.compute.manager [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Received event network-changed-feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 591.292809] env[68492]: DEBUG nova.compute.manager [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Refreshing instance network info cache due to event network-changed-feaa8b45-f990-433f-aa2d-d0b5d7fc4c57. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 591.293624] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Acquiring lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 591.293951] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Acquired lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 591.294305] env[68492]: DEBUG nova.network.neutron [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Refreshing network info cache for port feaa8b45-f990-433f-aa2d-d0b5d7fc4c57 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 592.059204] env[68492]: DEBUG nova.compute.manager [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Received event network-changed-7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 592.059204] env[68492]: DEBUG nova.compute.manager [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Refreshing instance network info cache due to event network-changed-7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 592.059204] env[68492]: DEBUG oslo_concurrency.lockutils [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] Acquiring lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 592.059204] env[68492]: DEBUG oslo_concurrency.lockutils [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] Acquired lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 592.059534] env[68492]: DEBUG nova.network.neutron [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Refreshing network info cache for port 7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 592.598577] env[68492]: DEBUG nova.network.neutron [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Updated VIF entry in instance network info cache for port feaa8b45-f990-433f-aa2d-d0b5d7fc4c57. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 592.599034] env[68492]: DEBUG nova.network.neutron [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Updating instance_info_cache with network_info: [{"id": "feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "address": "fa:16:3e:c5:5a:fe", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.141", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfeaa8b45-f9", "ovs_interfaceid": "feaa8b45-f990-433f-aa2d-d0b5d7fc4c57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.618615] env[68492]: DEBUG oslo_concurrency.lockutils [req-5c9ec30f-7157-4382-a656-6357a3ea55c8 req-5c50e306-a80b-43a2-9bd7-fb4071b3131a service nova] Releasing lock "refresh_cache-acbc1e36-0803-44ff-8ebc-094083193bc4" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 592.973846] env[68492]: DEBUG nova.network.neutron [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Updated VIF entry in instance network info cache for port 
7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 592.974469] env[68492]: DEBUG nova.network.neutron [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Updating instance_info_cache with network_info: [{"id": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "address": "fa:16:3e:ae:d1:77", "network": {"id": "45f8649f-a424-409e-8030-e5b80e3b4714", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-957537048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a8d0bf43012c42e1902c054df4ea4e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7ffd7eea-2a", "ovs_interfaceid": "7ffd7eea-2aa9-4cca-ab84-d0fe4d11cf57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.988624] env[68492]: DEBUG oslo_concurrency.lockutils [req-67e21c96-eca7-4cfa-9e24-1ebec0dbccdb req-22c3339d-8569-499c-90c9-ee9edf3b8ccf service nova] Releasing lock "refresh_cache-12450355-d90e-40dc-b66f-6105ec320d19" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.023860] env[68492]: DEBUG oslo_concurrency.lockutils [None req-66c017ed-8ec9-4027-92c3-9c61b16862de tempest-InstanceActionsV221TestJSON-723775731 tempest-InstanceActionsV221TestJSON-723775731-project-member] Acquiring lock "26967217-559c-4987-ba55-6eb1ff782b24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.024188] env[68492]: DEBUG oslo_concurrency.lockutils [None req-66c017ed-8ec9-4027-92c3-9c61b16862de tempest-InstanceActionsV221TestJSON-723775731 tempest-InstanceActionsV221TestJSON-723775731-project-member] Lock "26967217-559c-4987-ba55-6eb1ff782b24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.740791] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Acquiring lock "e8f36d0a-e116-4bc4-91a4-a6c463a6c373" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.740791] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 
tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "e8f36d0a-e116-4bc4-91a4-a6c463a6c373" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.766776] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Acquiring lock "f71b71d9-18c5-4715-ad3b-9d7ac2063d31" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.767617] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "f71b71d9-18c5-4715-ad3b-9d7ac2063d31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.796944] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Acquiring lock "f73c13d0-db0e-4a74-9ece-62f364bf8383" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.796944] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "f73c13d0-db0e-4a74-9ece-62f364bf8383" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 596.373857] env[68492]: DEBUG oslo_concurrency.lockutils [None req-541f78a2-d337-4fdd-b8c4-42d37871c3e7 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 596.374196] env[68492]: DEBUG oslo_concurrency.lockutils [None req-541f78a2-d337-4fdd-b8c4-42d37871c3e7 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 597.980689] env[68492]: DEBUG oslo_concurrency.lockutils [None req-add155ab-b916-4fcf-9c47-10a8b210eec6 tempest-ServersNegativeTestJSON-1148478936 tempest-ServersNegativeTestJSON-1148478936-project-member] Acquiring lock "431adf1d-c988-4832-96c1-6d7ae8de0745" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 597.980689] env[68492]: DEBUG oslo_concurrency.lockutils [None req-add155ab-b916-4fcf-9c47-10a8b210eec6 tempest-ServersNegativeTestJSON-1148478936 tempest-ServersNegativeTestJSON-1148478936-project-member] Lock "431adf1d-c988-4832-96c1-6d7ae8de0745" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.314022] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70544937-df53-4e6d-bb4e-2c2e455cc650 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "20538544-eb9b-4f0e-a49e-120fc721f651" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.315055] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70544937-df53-4e6d-bb4e-2c2e455cc650 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "20538544-eb9b-4f0e-a49e-120fc721f651" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 600.010263] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a226b7a1-69ef-4c35-9f03-0504fb3f179f tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Acquiring lock "ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 600.010482] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a226b7a1-69ef-4c35-9f03-0504fb3f179f tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Lock "ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.512840] env[68492]: DEBUG oslo_concurrency.lockutils [None req-18bee2f4-316a-4c17-8fe2-bc3722cc6928 tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Acquiring lock "d947bb3a-3877-4628-9b83-8d380b47261d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.513401] env[68492]: DEBUG oslo_concurrency.lockutils [None req-18bee2f4-316a-4c17-8fe2-bc3722cc6928 tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Lock "d947bb3a-3877-4628-9b83-8d380b47261d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.684268] env[68492]: WARNING 
oslo_vmware.rw_handles [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 602.684268] env[68492]: ERROR oslo_vmware.rw_handles [ 602.684856] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 602.685884] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 602.686581] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Copying Virtual Disk [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/9b9fbec1-864b-4be3-aa8b-8f183733ab53/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 602.686912] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6f53bbf7-e01b-4d6e-a8c0-576df9913121 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.697732] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for the task: (returnval){ [ 602.697732] env[68492]: value = "task-3395348" [ 602.697732] env[68492]: _type = "Task" [ 602.697732] env[68492]: } to complete. 
[ 602.697732] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for the task: (returnval){
[ 602.697732] env[68492]: value = "task-3395348"
[ 602.697732] env[68492]: _type = "Task"
[ 602.697732] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 602.708458] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Task: {'id': task-3395348, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 602.742689] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b5df45b9-b527-4dbc-abca-981cf8bb032a tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "1509151e-59a9-41b2-ad52-22a5d888bd5d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 602.742998] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b5df45b9-b527-4dbc-abca-981cf8bb032a tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "1509151e-59a9-41b2-ad52-22a5d888bd5d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.211747] env[68492]: DEBUG oslo_vmware.exceptions [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 603.211747] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 603.214620] env[68492]: ERROR nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 603.214620] env[68492]: Faults: ['InvalidArgument']
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Traceback (most recent call last):
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     yield resources
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self.driver.spawn(context, instance, image_meta,
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self._fetch_image_if_missing(context, vi)
[ 603.214620] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     image_cache(vi, tmp_image_ds_loc)
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     vm_util.copy_virtual_disk(
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     session._wait_for_task(vmdk_copy_task)
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return self.wait_for_task(task_ref)
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return evt.wait()
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     result = hub.switch()
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 603.215055] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return self.greenlet.switch()
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self.f(*self.args, **self.kw)
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     raise exceptions.translate_fault(task_info.error)
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Faults: ['InvalidArgument']
[ 603.215404] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]
[ 603.215404] env[68492]: INFO nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Terminating instance
[ 603.216382] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 603.216382] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 603.216871] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 603.217356] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquired lock "refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 603.217796] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 603.218771] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52674f64-e2b7-42a5-a8a0-88c6cd8916a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.236950] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
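The traceback above ends in oslo.vmware's task polling: wait_for_task() drives a looping call that reads the task's state until vCenter reports success or error, and an errored task is translated into a fault exception, which is how the InvalidArgument fault surfaces as VimFaultException. A simplified, synchronous sketch of that loop, with get_task_info as a stand-in for the real PropertyCollector read (not the oslo.vmware source):

import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, task_ref, interval=0.5):
    # Poll until the task reaches a terminal state, mirroring the
    # "progress is 0%" -> success/error flow in the records above.
    while True:
        info = get_task_info(task_ref)  # placeholder for the vCenter read
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # Corresponds to the 'raise exceptions.translate_fault(...)'
            # frame in the traceback above.
            raise VimFaultException(info["faults"], info["message"])
        time.sleep(interval)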
[ 603.238030] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 603.238274] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-441cc347-4c22-4aa1-881b-b91dd9b57ab6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.244873] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Waiting for the task: (returnval){
[ 603.244873] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52cddaec-d7da-9843-f349-dc217775f3e9"
[ 603.244873] env[68492]: _type = "Task"
[ 603.244873] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 603.254612] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52cddaec-d7da-9843-f349-dc217775f3e9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 603.277456] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 603.561266] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 603.572335] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Releasing lock "refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 603.573076] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Start destroying the instance on the hypervisor.
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 603.573076] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 603.575743] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38a1a204-889b-4b5c-8665-c847e5a9d11b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.592690] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 603.593042] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0360351b-9648-4c6f-acf9-b528e2b3b034 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.635083] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 603.635323] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 603.635503] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Deleting the datastore file [datastore2] d1d77916-2250-4bce-a3c1-50a2dda3627f {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 603.635780] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-86028085-d4d2-4d89-a93a-7908dddadcf9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.647053] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for the task: (returnval){ [ 603.647053] env[68492]: value = "task-3395350" [ 603.647053] env[68492]: _type = "Task" [ 603.647053] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 603.660262] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Task: {'id': task-3395350, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 603.760872] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 603.761207] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Creating directory with path [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 603.764192] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-211257de-37f7-412e-8bbb-e96b491b8191 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.774940] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Created directory with path [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 603.776099] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Fetch image to [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 603.776099] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 603.777655] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b499b035-f2fd-47bc-9c33-08e44a55134a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.790838] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf42f63c-09e7-4bf6-bab2-aa5af9bdc87c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.799251] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e6d07eb-d280-4c34-a382-b10291e3765d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.836922] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65a41d88-c5c0-4403-bdbd-d317124e84a4 {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.843429] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c8d3cf8b-1684-49e2-b981-15f79f789ba5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.930970] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 604.009750] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 604.079030] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 604.079030] env[68492]: DEBUG oslo_vmware.rw_handles [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 604.156662] env[68492]: DEBUG oslo_vmware.api [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Task: {'id': task-3395350, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035006} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 604.156898] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 604.157255] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 604.157419] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 604.159085] env[68492]: INFO nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Took 0.58 seconds to destroy the instance on the hypervisor. [ 604.159085] env[68492]: DEBUG oslo.service.loopingcall [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 604.159085] env[68492]: DEBUG nova.compute.manager [-] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 604.160848] env[68492]: DEBUG nova.compute.claims [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 604.161028] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 604.161243] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 604.588078] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a23dc1-2008-41f6-a42d-d33d8ae42faf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.597038] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d9c1f7e-b0ce-4cad-8729-61cd5592f01d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.629496] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f205c3e-0b58-4f5c-88e9-e330052dde5e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.636817] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-831d0eb1-e8a2-4a11-aef7-25e77d602dab {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.650929] env[68492]: DEBUG nova.compute.provider_tree [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 604.671043] env[68492]: DEBUG nova.scheduler.client.report [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 604.693682] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 
tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.532s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 604.694281] env[68492]: ERROR nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 604.694281] env[68492]: Faults: ['InvalidArgument']
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Traceback (most recent call last):
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self.driver.spawn(context, instance, image_meta,
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self._fetch_image_if_missing(context, vi)
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     image_cache(vi, tmp_image_ds_loc)
[ 604.694281] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     vm_util.copy_virtual_disk(
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     session._wait_for_task(vmdk_copy_task)
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return self.wait_for_task(task_ref)
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return evt.wait()
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     result = hub.switch()
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     return self.greenlet.switch()
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 604.694809] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     self.f(*self.args, **self.kw)
[ 604.695239] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 604.695239] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]     raise exceptions.translate_fault(task_info.error)
[ 604.695239] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 604.695239] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Faults: ['InvalidArgument']
[ 604.695239] env[68492]: ERROR nova.compute.manager [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f]
[ 604.695534] env[68492]: DEBUG nova.compute.utils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 604.698325] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Build of instance d1d77916-2250-4bce-a3c1-50a2dda3627f was re-scheduled: A specified parameter was not correct: fileType
[ 604.698325] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 604.698723] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 604.698955] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquiring lock "refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 604.699205] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Acquired lock
"refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 604.699372] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 604.974354] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 605.284163] env[68492]: DEBUG nova.network.neutron [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.295093] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Releasing lock "refresh_cache-d1d77916-2250-4bce-a3c1-50a2dda3627f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 605.295342] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 605.295523] env[68492]: DEBUG nova.compute.manager [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 605.420296] env[68492]: INFO nova.scheduler.client.report [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Deleted allocations for instance d1d77916-2250-4bce-a3c1-50a2dda3627f [ 605.457106] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fee25645-3b9c-442e-9114-d685ae6fa862 tempest-ServerShowV257Test-2027026142 tempest-ServerShowV257Test-2027026142-project-member] Lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.125s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 605.457640] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 45.225s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.457838] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: d1d77916-2250-4bce-a3c1-50a2dda3627f] During sync_power_state the instance has a pending task (spawning). Skip. [ 605.461610] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "d1d77916-2250-4bce-a3c1-50a2dda3627f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 605.489480] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 605.586230] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 605.586230] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.586925] env[68492]: INFO nova.compute.claims [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 606.164225] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64cfef55-a730-4582-b583-c93e7a7f6592 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.173660] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-889627e5-dd24-42fe-9cbd-53f6b80f66af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.214755] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d700f71-e54a-4e3a-9327-0ab682971429 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.228742] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf67d169-9eef-4ec8-8443-cbc8a60eb213 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.243598] env[68492]: DEBUG nova.compute.provider_tree [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.255124] env[68492]: DEBUG nova.scheduler.client.report [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.272401] env[68492]: DEBUG oslo_concurrency.lockutils 
[None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.687s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 606.274665] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 606.331210] env[68492]: DEBUG nova.compute.utils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 606.333278] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 606.333446] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 606.346359] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 606.434141] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 606.466017] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 606.466273] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 606.466428] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 606.466638] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 606.466802] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 606.466948] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 606.467164] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 606.467327] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 606.467491] 
env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 606.467674] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 606.467811] env[68492]: DEBUG nova.virt.hardware [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 606.468717] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdc2ca29-c476-4a51-9117-c23f639ed808 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.478496] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2048a39-a080-498a-9b37-ef08c5b60b75 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 606.728339] env[68492]: DEBUG nova.policy [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7938ef239a0c4ae29febdd7ecf1cde37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3d54feaed07492da952b05c788f99f9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 607.166134] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Successfully created port: f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 608.157744] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Successfully updated port: f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 608.190120] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.190120] env[68492]: DEBUG oslo_concurrency.lockutils [None 
req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 608.190120] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 608.232384] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 608.440330] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Updating instance_info_cache with network_info: [{"id": "f8a8ff32-e904-4947-ba67-c788b6718d36", "address": "fa:16:3e:94:d4:0b", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8a8ff32-e9", "ovs_interfaceid": "f8a8ff32-e904-4947-ba67-c788b6718d36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.462930] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 608.463318] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance network_info: |[{"id": "f8a8ff32-e904-4947-ba67-c788b6718d36", "address": "fa:16:3e:94:d4:0b", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.233.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8a8ff32-e9", "ovs_interfaceid": "f8a8ff32-e904-4947-ba67-c788b6718d36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 608.463972] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:d4:0b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f8a8ff32-e904-4947-ba67-c788b6718d36', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 608.477312] env[68492]: DEBUG oslo.service.loopingcall [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 608.479848] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 608.479848] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-878825fb-e25a-4d10-8728-f9735f831382 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 608.501685] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 608.501685] env[68492]: value = "task-3395351" [ 608.501685] env[68492]: _type = "Task" [ 608.501685] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 608.512659] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395351, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 609.012794] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395351, 'name': CreateVM_Task, 'duration_secs': 0.372268} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 609.012973] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 609.013808] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 609.014126] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 609.014608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 609.014976] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a75ae21-787d-403f-bb73-2fc17181e0f2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.021442] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 609.021442] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520a8fec-4052-bc85-0037-6ec12aed6857" [ 609.021442] env[68492]: _type = "Task" [ 609.021442] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 609.031914] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520a8fec-4052-bc85-0037-6ec12aed6857, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 609.503130] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.503578] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.520888] env[68492]: DEBUG nova.compute.manager [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Received event network-vif-plugged-f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 609.520987] env[68492]: DEBUG oslo_concurrency.lockutils [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] Acquiring lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.521206] env[68492]: DEBUG oslo_concurrency.lockutils [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.521404] env[68492]: DEBUG oslo_concurrency.lockutils [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 609.521551] env[68492]: DEBUG nova.compute.manager [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] No waiting events found dispatching network-vif-plugged-f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 609.521829] env[68492]: WARNING nova.compute.manager [req-2f9fa3e1-0c2f-4c1e-8f5e-5c72234f3b8b req-b96e77e3-0f05-4871-82a1-b58ee54a2bb9 service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Received unexpected event network-vif-plugged-f8a8ff32-e904-4947-ba67-c788b6718d36 for instance with vm_state building and task_state spawning. 
[ 609.534625] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 609.534857] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 609.535085] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.756251] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fd2470d5-2181-48dc-bdf6-3debc140039a tempest-ServerDiagnosticsV248Test-663931398 tempest-ServerDiagnosticsV248Test-663931398-project-member] Acquiring lock "aae38f8c-fe29-478b-946a-1f75bb9434a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 610.756700] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fd2470d5-2181-48dc-bdf6-3debc140039a tempest-ServerDiagnosticsV248Test-663931398 tempest-ServerDiagnosticsV248Test-663931398-project-member] Lock "aae38f8c-fe29-478b-946a-1f75bb9434a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.297179] env[68492]: DEBUG nova.compute.manager [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Received event network-changed-f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 613.297179] env[68492]: DEBUG nova.compute.manager [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Refreshing instance network info cache due to event network-changed-f8a8ff32-e904-4947-ba67-c788b6718d36. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 613.297179] env[68492]: DEBUG oslo_concurrency.lockutils [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] Acquiring lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 613.297179] env[68492]: DEBUG oslo_concurrency.lockutils [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] Acquired lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 613.297179] env[68492]: DEBUG nova.network.neutron [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Refreshing network info cache for port f8a8ff32-e904-4947-ba67-c788b6718d36 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 613.627044] env[68492]: DEBUG nova.network.neutron [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Updated VIF entry in instance network info cache for port f8a8ff32-e904-4947-ba67-c788b6718d36. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 613.627044] env[68492]: DEBUG nova.network.neutron [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Updating instance_info_cache with network_info: [{"id": "f8a8ff32-e904-4947-ba67-c788b6718d36", "address": "fa:16:3e:94:d4:0b", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8a8ff32-e9", "ovs_interfaceid": "f8a8ff32-e904-4947-ba67-c788b6718d36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.643061] env[68492]: DEBUG oslo_concurrency.lockutils [req-51dba57c-5897-4af5-ac1e-fe5301c31ba7 req-921188b8-f087-4571-a6af-a9ed313a40cc service nova] Releasing lock "refresh_cache-14af3749-f031-4543-96e4-af0b4fd28e2b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 616.067494] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c636232-a89a-47dc-9e02-e4820174d228 tempest-ServerAddressesTestJSON-565573396 tempest-ServerAddressesTestJSON-565573396-project-member] Acquiring lock "e410e6fa-7652-45d1-8ec1-f1c1db5c728f" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.067820] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c636232-a89a-47dc-9e02-e4820174d228 tempest-ServerAddressesTestJSON-565573396 tempest-ServerAddressesTestJSON-565573396-project-member] Lock "e410e6fa-7652-45d1-8ec1-f1c1db5c728f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 619.702054] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cf9c310f-172d-4e95-b4b4-607f3caf131b tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Acquiring lock "e7c66cb6-10fc-44d4-9821-6e3141e04024" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 619.702368] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cf9c310f-172d-4e95-b4b4-607f3caf131b tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "e7c66cb6-10fc-44d4-9821-6e3141e04024" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.784931] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fe4e49e-cce3-469c-b74c-6e44c83ce18c tempest-ImagesNegativeTestJSON-1217222349 tempest-ImagesNegativeTestJSON-1217222349-project-member] Acquiring lock "31f0fab8-123f-4857-93a7-517ac44dbf9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.789079] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fe4e49e-cce3-469c-b74c-6e44c83ce18c tempest-ImagesNegativeTestJSON-1217222349 tempest-ImagesNegativeTestJSON-1217222349-project-member] Lock "31f0fab8-123f-4857-93a7-517ac44dbf9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 628.484266] env[68492]: DEBUG oslo_concurrency.lockutils [None req-49587811-2a76-4767-8140-91ad086366cc tempest-FloatingIPsAssociationNegativeTestJSON-1547245369 tempest-FloatingIPsAssociationNegativeTestJSON-1547245369-project-member] Acquiring lock "d720fc20-a7a6-4826-9174-2fb12bb0a6c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 628.484565] env[68492]: DEBUG oslo_concurrency.lockutils [None req-49587811-2a76-4767-8140-91ad086366cc tempest-FloatingIPsAssociationNegativeTestJSON-1547245369 tempest-FloatingIPsAssociationNegativeTestJSON-1547245369-project-member] Lock "d720fc20-a7a6-4826-9174-2fb12bb0a6c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 629.938256] env[68492]: DEBUG oslo_concurrency.lockutils [None req-083cddbc-c6cc-4246-bd73-59984fcd3343 tempest-FloatingIPsAssociationTestJSON-485227705 tempest-FloatingIPsAssociationTestJSON-485227705-project-member] Acquiring lock "2590f6bd-a48f-49ad-b955-a0ebec9d31e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 629.939593] env[68492]: DEBUG oslo_concurrency.lockutils [None req-083cddbc-c6cc-4246-bd73-59984fcd3343 tempest-FloatingIPsAssociationTestJSON-485227705 tempest-FloatingIPsAssociationTestJSON-485227705-project-member] Lock "2590f6bd-a48f-49ad-b955-a0ebec9d31e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.180083] env[68492]: DEBUG oslo_concurrency.lockutils [None req-53f0b282-5d2d-4456-82ee-fca2cd7c3ca8 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Acquiring lock "9d15dfea-323f-4007-91cb-0a0b64d60a5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.180395] env[68492]: DEBUG oslo_concurrency.lockutils [None req-53f0b282-5d2d-4456-82ee-fca2cd7c3ca8 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Lock "9d15dfea-323f-4007-91cb-0a0b64d60a5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.690033] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 631.724720] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.231179] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.231464] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.231570] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 632.231683] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the 
list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 632.263283] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.263633] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.263633] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.263753] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.263806] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.263931] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.264268] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.264450] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.264717] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.264717] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 632.264809] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 632.266424] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.266617] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.266769] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.266924] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.267096] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 633.230837] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 633.231132] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 633.245598] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.245844] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.245972] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 633.246129] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 633.250230] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e94e3963-f6f1-4cd4-b431-490446591ab0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.256911] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0cdcf25-824b-4f0e-9d21-7b153873d8bd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.270821] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95b6c385-78ac-44da-9076-477f8fd3fcc4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.279795] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a46d7b67-de4f-4da5-90b0-d434e8b105d8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.312489] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180959MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 633.312489] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.312489] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.417910] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b1180e4b-9e82-42e3-867c-b4a757ca6f14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.418033] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 02050238-c4a5-4c06-952d-06af14ff7d35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.418143] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3de34725-4b54-4956-b2b6-285c9138e94c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.418269] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e9f787fc-98be-4086-9b70-ebbf33e31d13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419375] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419375] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f3c94673-a8fc-4ead-9907-4347cd6244ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419375] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419375] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419623] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.419623] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 633.455394] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.482649] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.494188] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.505613] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 26967217-559c-4987-ba55-6eb1ff782b24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.516642] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e8f36d0a-e116-4bc4-91a4-a6c463a6c373 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.528410] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f71b71d9-18c5-4715-ad3b-9d7ac2063d31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.541591] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f73c13d0-db0e-4a74-9ece-62f364bf8383 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.551359] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.565685] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 431adf1d-c988-4832-96c1-6d7ae8de0745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.580464] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 20538544-eb9b-4f0e-a49e-120fc721f651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.591713] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.601391] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d947bb3a-3877-4628-9b83-8d380b47261d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.615981] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1509151e-59a9-41b2-ad52-22a5d888bd5d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.632626] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.645149] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aae38f8c-fe29-478b-946a-1f75bb9434a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.658416] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e410e6fa-7652-45d1-8ec1-f1c1db5c728f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.670123] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e7c66cb6-10fc-44d4-9821-6e3141e04024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.686850] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 31f0fab8-123f-4857-93a7-517ac44dbf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.700033] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d720fc20-a7a6-4826-9174-2fb12bb0a6c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.716519] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2590f6bd-a48f-49ad-b955-a0ebec9d31e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.728115] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9d15dfea-323f-4007-91cb-0a0b64d60a5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 633.728557] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 633.728557] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 634.211602] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7633f93-5f7f-4170-9fc3-a94359f9ce20 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.220419] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225bb28d-758c-49ac-b5c4-8991a6ae7549 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.256771] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdaa5f16-2eff-4465-90ba-71d49d1c2a2e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.268489] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b2cc748-38fd-4f67-af79-b90648f592b9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.284354] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.293792] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.309921] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 634.310136] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.998s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 639.306200] env[68492]: DEBUG oslo_concurrency.lockutils [None 
req-513f315a-28ad-46ef-b482-909fc804883e tempest-AttachInterfacesV270Test-472283853 tempest-AttachInterfacesV270Test-472283853-project-member] Acquiring lock "81d59156-2869-4045-a2d3-349e6077f477" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.306860] env[68492]: DEBUG oslo_concurrency.lockutils [None req-513f315a-28ad-46ef-b482-909fc804883e tempest-AttachInterfacesV270Test-472283853 tempest-AttachInterfacesV270Test-472283853-project-member] Lock "81d59156-2869-4045-a2d3-349e6077f477" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 642.023134] env[68492]: DEBUG oslo_concurrency.lockutils [None req-3c0ebeb4-da4b-4dca-9428-37df29488a3e tempest-ServerGroupTestJSON-859793356 tempest-ServerGroupTestJSON-859793356-project-member] Acquiring lock "1ee59a29-0ef7-4906-a027-90992418c3fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 642.023446] env[68492]: DEBUG oslo_concurrency.lockutils [None req-3c0ebeb4-da4b-4dca-9428-37df29488a3e tempest-ServerGroupTestJSON-859793356 tempest-ServerGroupTestJSON-859793356-project-member] Lock "1ee59a29-0ef7-4906-a027-90992418c3fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.349499] env[68492]: WARNING oslo_vmware.rw_handles [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 650.349499] env[68492]: ERROR oslo_vmware.rw_handles [ 650.350126] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: 
b1180e4b-9e82-42e3-867c-b4a757ca6f14] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 650.351400] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 650.351699] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Copying Virtual Disk [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/2672aec8-a94e-4836-b554-94560adcbd4d/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 650.352010] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4a96333c-2435-4abd-b59b-85953bede36a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.361555] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Waiting for the task: (returnval){ [ 650.361555] env[68492]: value = "task-3395352" [ 650.361555] env[68492]: _type = "Task" [ 650.361555] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.369598] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Task: {'id': task-3395352, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.873016] env[68492]: DEBUG oslo_vmware.exceptions [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 650.873016] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.873554] env[68492]: ERROR nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 650.873554] env[68492]: Faults: ['InvalidArgument'] [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Traceback (most recent call last): [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] yield resources [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self.driver.spawn(context, instance, image_meta, [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self._vmops.spawn(context, instance, image_meta, injected_files, [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self._fetch_image_if_missing(context, vi) [ 650.873554] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] image_cache(vi, tmp_image_ds_loc) [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] vm_util.copy_virtual_disk( [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] session._wait_for_task(vmdk_copy_task) [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return self.wait_for_task(task_ref) [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return evt.wait() [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] result = hub.switch() [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 650.873850] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return self.greenlet.switch() [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self.f(*self.args, **self.kw) [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] raise exceptions.translate_fault(task_info.error) [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Faults: ['InvalidArgument'] [ 650.874164] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] [ 650.874164] env[68492]: INFO nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Terminating instance [ 650.875357] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.875558] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 650.875797] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e98d12b-f84a-44d4-9832-4641a3ebb6bd {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.879103] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 650.879375] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 650.880164] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-679ed340-6c3a-492f-aa86-cd1a6d8b6437 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.883911] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 650.884096] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 650.885087] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ba96023f-9e20-4afb-9988-80aaaa4f7f9c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.889046] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 650.889512] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-43546f09-f3e5-4b66-b0db-30bb363e1b08 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.891786] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for the task: (returnval){ [ 650.891786] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528a5fea-b1d2-4639-4818-83fd2de63743" [ 650.891786] env[68492]: _type = "Task" [ 650.891786] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.899286] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528a5fea-b1d2-4639-4818-83fd2de63743, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.958654] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 650.958878] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 650.959102] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Deleting the datastore file [datastore2] b1180e4b-9e82-42e3-867c-b4a757ca6f14 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 650.959368] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a4988d4e-3795-4c36-b0c4-88608316f8e9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.965144] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Waiting for the task: (returnval){ [ 650.965144] env[68492]: value = "task-3395354" [ 650.965144] env[68492]: _type = "Task" [ 650.965144] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 650.972870] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Task: {'id': task-3395354, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 651.401692] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 651.401986] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Creating directory with path [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 651.402226] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-635642c7-d76a-40a3-83f7-1224e15570d9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.414564] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Created directory with path [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 651.414768] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Fetch image to [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 651.414936] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 651.415691] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da743b5d-d2ff-40f6-ab09-8d0ed7c4e109 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.422150] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcfa9cc6-84c5-4a90-bc80-7703b82e88e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.431144] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e124fe7-cccc-45ed-a805-f89010ff08ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.461941] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf9474c5-50a5-4131-a25c-4af54c44e875 {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.469918] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-258d5807-91e7-4945-b2b2-a1b6c9f1a64d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.474154] env[68492]: DEBUG oslo_vmware.api [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Task: {'id': task-3395354, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066594} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 651.474768] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 651.474956] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 651.475137] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 651.475306] env[68492]: INFO nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 651.477364] env[68492]: DEBUG nova.compute.claims [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 651.477550] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 651.477761] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 651.494456] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 651.548825] env[68492]: DEBUG oslo_vmware.rw_handles [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 651.608122] env[68492]: DEBUG oslo_vmware.rw_handles [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 651.608322] env[68492]: DEBUG oslo_vmware.rw_handles [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 651.947370] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41619766-2f7d-40c5-b262-cca7ebaaf336 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.955147] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb7c155f-e09c-4a43-beab-8d18817fc192 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.986151] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10af4924-9e39-4f01-8d27-a664b12d5be1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.993285] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a281215a-1759-4900-917c-b44113d1360c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.007377] env[68492]: DEBUG nova.compute.provider_tree [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.015893] env[68492]: DEBUG nova.scheduler.client.report [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 652.031970] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.554s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.032499] env[68492]: ERROR nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.032499] env[68492]: Faults: ['InvalidArgument'] [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Traceback (most recent call last): [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 652.032499] env[68492]: ERROR nova.compute.manager 
[instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self.driver.spawn(context, instance, image_meta, [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self._vmops.spawn(context, instance, image_meta, injected_files, [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self._fetch_image_if_missing(context, vi) [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] image_cache(vi, tmp_image_ds_loc) [ 652.032499] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] vm_util.copy_virtual_disk( [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] session._wait_for_task(vmdk_copy_task) [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return self.wait_for_task(task_ref) [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return evt.wait() [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] result = hub.switch() [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] return self.greenlet.switch() [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 652.032818] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] self.f(*self.args, **self.kw) [ 652.033192] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 652.033192] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] raise exceptions.translate_fault(task_info.error) [ 652.033192] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.033192] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Faults: ['InvalidArgument'] [ 652.033192] env[68492]: ERROR nova.compute.manager [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] [ 652.033192] env[68492]: DEBUG nova.compute.utils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 652.034963] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Build of instance b1180e4b-9e82-42e3-867c-b4a757ca6f14 was re-scheduled: A specified parameter was not correct: fileType [ 652.034963] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 652.035386] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 652.035562] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 652.035716] env[68492]: DEBUG nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 652.035874] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 652.566774] env[68492]: DEBUG nova.network.neutron [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 652.579057] env[68492]: INFO nova.compute.manager [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] Took 0.54 seconds to deallocate network for instance. [ 652.699023] env[68492]: INFO nova.scheduler.client.report [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Deleted allocations for instance b1180e4b-9e82-42e3-867c-b4a757ca6f14 [ 652.721119] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fa0b0cc9-8f70-4941-803b-1173bfe1e22d tempest-ServerDiagnosticsTest-641773545 tempest-ServerDiagnosticsTest-641773545-project-member] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 97.330s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.722454] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 92.489s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.723102] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b1180e4b-9e82-42e3-867c-b4a757ca6f14] During sync_power_state the instance has a pending task (spawning). Skip. [ 652.723102] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "b1180e4b-9e82-42e3-867c-b4a757ca6f14" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.749193] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 652.814067] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.814067] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.814067] env[68492]: INFO nova.compute.claims [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 653.308898] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eab47a1c-81a4-4212-b0ff-225cf5e70630 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.316846] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ad8e8e-b2fc-4cfb-907d-9e203513b416 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.347014] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea49f1c-054b-447b-99f8-34af86afb8a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.355722] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7dfa385-ee79-466a-ae2f-d304f275225c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.369881] env[68492]: DEBUG nova.compute.provider_tree [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 653.378521] env[68492]: DEBUG nova.scheduler.client.report [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 653.398711] 
env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.586s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.399236] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 653.448667] env[68492]: DEBUG nova.compute.utils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 653.449488] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 653.449612] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 653.460919] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 653.520263] env[68492]: DEBUG nova.policy [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6522ace0342b418ba35f7874c5661808', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c86cb33e9c44fd1a7aa06acc0a66267', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 653.539663] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 653.568113] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 653.568500] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 653.568500] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 653.568597] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 653.569582] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 653.569582] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 653.569582] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 653.569582] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 653.569582] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 653.569762] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 653.569762] env[68492]: DEBUG nova.virt.hardware [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 653.571240] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a17fbb59-c585-48c6-8d6e-2c0429af7608 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.579279] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b06a6d4-46cf-4385-a10b-fc764b15e99a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.878335] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Successfully created port: 7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 654.641723] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Successfully updated port: 7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 654.656244] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 654.656395] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquired lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 654.656547] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Building network info cache for instance {{(pid=68492) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 654.698689] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 654.871640] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Updating instance_info_cache with network_info: [{"id": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "address": "fa:16:3e:76:17:17", "network": {"id": "3f4140f8-118f-4c44-89ac-1883fc60ea6d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1763220690-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c86cb33e9c44fd1a7aa06acc0a66267", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f7a1f33a-9466-4c83-89f6-fd990f47b1ef", "external-id": "nsx-vlan-transportzone-90", "segmentation_id": 90, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7725c5ef-b5", "ovs_interfaceid": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.888050] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Releasing lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 654.888050] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance network_info: |[{"id": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "address": "fa:16:3e:76:17:17", "network": {"id": "3f4140f8-118f-4c44-89ac-1883fc60ea6d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1763220690-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c86cb33e9c44fd1a7aa06acc0a66267", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f7a1f33a-9466-4c83-89f6-fd990f47b1ef", "external-id": 
"nsx-vlan-transportzone-90", "segmentation_id": 90, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7725c5ef-b5", "ovs_interfaceid": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 654.888435] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:76:17:17', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f7a1f33a-9466-4c83-89f6-fd990f47b1ef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7725c5ef-b56b-4e37-9db6-6ef811d50a88', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 654.896302] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Creating folder: Project (2c86cb33e9c44fd1a7aa06acc0a66267). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 654.896874] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0798e6e5-8432-4613-a928-c0d6f6551d0a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.907461] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Created folder: Project (2c86cb33e9c44fd1a7aa06acc0a66267) in parent group-v677434. [ 654.907461] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Creating folder: Instances. Parent ref: group-v677466. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 654.907461] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aa930fe5-8b9c-4ec8-9382-b15874de9ce9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.916459] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Created folder: Instances in parent group-v677466. [ 654.916950] env[68492]: DEBUG oslo.service.loopingcall [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 654.916950] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 654.917059] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-48a79fae-9f3b-4d91-9d0e-4832de4d90a0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.935650] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 654.935650] env[68492]: value = "task-3395357" [ 654.935650] env[68492]: _type = "Task" [ 654.935650] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 654.940936] env[68492]: DEBUG nova.compute.manager [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Received event network-vif-plugged-7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 654.941192] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Acquiring lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.941394] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.941554] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 654.941741] env[68492]: DEBUG nova.compute.manager [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] No waiting events found dispatching network-vif-plugged-7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 654.941869] env[68492]: WARNING nova.compute.manager [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Received unexpected event network-vif-plugged-7725c5ef-b56b-4e37-9db6-6ef811d50a88 for instance with vm_state building and task_state spawning. 
[ 654.942106] env[68492]: DEBUG nova.compute.manager [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Received event network-changed-7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 654.942220] env[68492]: DEBUG nova.compute.manager [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Refreshing instance network info cache due to event network-changed-7725c5ef-b56b-4e37-9db6-6ef811d50a88. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 654.942557] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Acquiring lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 654.942557] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Acquired lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 654.942665] env[68492]: DEBUG nova.network.neutron [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Refreshing network info cache for port 7725c5ef-b56b-4e37-9db6-6ef811d50a88 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 654.948908] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395357, 'name': CreateVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.222406] env[68492]: DEBUG nova.network.neutron [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Updated VIF entry in instance network info cache for port 7725c5ef-b56b-4e37-9db6-6ef811d50a88. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 655.222747] env[68492]: DEBUG nova.network.neutron [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Updating instance_info_cache with network_info: [{"id": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "address": "fa:16:3e:76:17:17", "network": {"id": "3f4140f8-118f-4c44-89ac-1883fc60ea6d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1763220690-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c86cb33e9c44fd1a7aa06acc0a66267", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f7a1f33a-9466-4c83-89f6-fd990f47b1ef", "external-id": "nsx-vlan-transportzone-90", "segmentation_id": 90, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7725c5ef-b5", "ovs_interfaceid": "7725c5ef-b56b-4e37-9db6-6ef811d50a88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.233992] env[68492]: DEBUG oslo_concurrency.lockutils [req-4df64622-be67-42a5-84ea-1ea37bcad533 req-61fa94b0-a2be-4674-99f1-e7c24e38b9d0 service nova] Releasing lock "refresh_cache-4f1ede2c-7ee7-415f-a656-6c792a1b508c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.446810] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395357, 'name': CreateVM_Task, 'duration_secs': 0.299195} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 655.446946] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 655.447846] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 655.448192] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 655.448341] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 655.448611] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14ceabbb-d87e-447c-8582-a66658f4c521 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.453064] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for the task: (returnval){ [ 655.453064] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52a191fe-f8a8-4254-69e3-4203ee9ecada" [ 655.453064] env[68492]: _type = "Task" [ 655.453064] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 655.460533] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52a191fe-f8a8-4254-69e3-4203ee9ecada, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.967612] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.967612] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 655.967612] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 659.199606] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.200589] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.310047] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.310322] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.310443] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.310594] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.310744] env[68492]: DEBUG 
oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.310885] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 693.311046] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.323989] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.324228] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.324397] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 693.324554] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 693.325671] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0afd686-d4bd-4266-bb30-2a16f1e202b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.335911] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-983417ce-499f-47e6-9177-b5ee1392320a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.348818] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75178c39-1a30-4da8-a087-a8399f15b10c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.355333] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e937bfb9-3858-4517-9098-d5fdb52860c0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.389422] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180962MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) 
_report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 693.389596] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.389791] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.467566] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 02050238-c4a5-4c06-952d-06af14ff7d35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.467739] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3de34725-4b54-4956-b2b6-285c9138e94c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.467888] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e9f787fc-98be-4086-9b70-ebbf33e31d13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468029] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468154] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f3c94673-a8fc-4ead-9907-4347cd6244ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468272] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468389] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468506] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468622] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.468737] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 693.479936] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.491371] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.502026] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 26967217-559c-4987-ba55-6eb1ff782b24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.514024] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e8f36d0a-e116-4bc4-91a4-a6c463a6c373 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.524748] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f71b71d9-18c5-4715-ad3b-9d7ac2063d31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.533842] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f73c13d0-db0e-4a74-9ece-62f364bf8383 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.544186] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.553605] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 431adf1d-c988-4832-96c1-6d7ae8de0745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.563918] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 20538544-eb9b-4f0e-a49e-120fc721f651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.573517] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.583519] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d947bb3a-3877-4628-9b83-8d380b47261d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.594537] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1509151e-59a9-41b2-ad52-22a5d888bd5d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.604077] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.615334] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aae38f8c-fe29-478b-946a-1f75bb9434a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.624967] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e410e6fa-7652-45d1-8ec1-f1c1db5c728f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.634309] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e7c66cb6-10fc-44d4-9821-6e3141e04024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.643833] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 31f0fab8-123f-4857-93a7-517ac44dbf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.654697] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d720fc20-a7a6-4826-9174-2fb12bb0a6c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.663889] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2590f6bd-a48f-49ad-b955-a0ebec9d31e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.673726] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9d15dfea-323f-4007-91cb-0a0b64d60a5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.682929] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 81d59156-2869-4045-a2d3-349e6077f477 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.692217] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1ee59a29-0ef7-4906-a027-90992418c3fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.702433] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
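[editor's note] The "Final resource view" logged just below follows arithmetically from the allocations above: ten instances are reported as actively managed, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and the numbers line up only if the inventory's 512 MB reserved memory is folded into used_ram (an assumption the totals support). The scheduled-but-not-started instances are skipped. A back-of-envelope check, not Nova's resource tracker:

# Reproduce used_ram=1792MB, used_disk=10GB, used_vcpus=10 from the log.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10  # 10 active instances

reserved_mb = 512  # MEMORY_MB 'reserved' from the inventory entries in this log
used_ram = sum(a["MEMORY_MB"] for a in allocations) + reserved_mb
used_disk = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)

print(used_ram, used_disk, used_vcpus)   # 1792 10 10 -- matches the final view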
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 693.702507] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 693.703265] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 694.070326] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-067e5fc6-123a-439a-b1ef-494869fce58f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.078282] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a88b0271-e37b-4b4b-8899-f20fd3eaa4e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.107800] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76e6ea0a-6a62-46dd-ada5-443059bfd608 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.115092] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87193be7-b2a0-4cba-a067-6a9e5666ad9d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.128598] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 694.136032] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 694.150103] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 694.150103] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.760s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 695.065619] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.065932] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 695.066060] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 695.066149] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 695.085994] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086176] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086309] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086435] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086559] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086678] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086796] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.086912] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. 
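[editor's note] The heal pass above rebuilds its candidate list and skips every instance still in the Building state, then reports that nothing is left to heal. A minimal sketch of that filter (not Nova's code; the Instance type and state constant are simplified):

# Sketch: skip Building instances when rebuilding the list to heal.
from dataclasses import dataclass

BUILDING = "building"

@dataclass
class Instance:
    uuid: str
    vm_state: str

def instances_to_heal(instances):
    to_heal = []
    for inst in instances:
        if inst.vm_state == BUILDING:
            print(f"[instance: {inst.uuid}] Skipping network cache update "
                  f"for instance because it is Building.")
            continue
        to_heal.append(inst)
    return to_heal

insts = [Instance("02050238-c4a5-4c06-952d-06af14ff7d35", BUILDING),
         Instance("4f1ede2c-7ee7-415f-a656-6c792a1b508c", BUILDING)]
if not instances_to_heal(insts):
    print("Didn't find any instances for network info cache update.")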
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.087038] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.087166] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 695.087286] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 695.087776] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.712794] env[68492]: WARNING oslo_vmware.rw_handles [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 697.712794] env[68492]: ERROR oslo_vmware.rw_handles [ 697.713448] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 697.715060] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Caching image 
{{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 697.715260] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Copying Virtual Disk [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/75a27360-0e01-4a26-b8c2-651009f1870c/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 697.715540] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dd702115-0bbf-4b62-bb0b-51dd045e11c9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.723522] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for the task: (returnval){ [ 697.723522] env[68492]: value = "task-3395358" [ 697.723522] env[68492]: _type = "Task" [ 697.723522] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 697.731527] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Task: {'id': task-3395358, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 698.235022] env[68492]: DEBUG oslo_vmware.exceptions [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Fault InvalidArgument not matched. 
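[editor's note] "Fault InvalidArgument not matched" indicates the fault-translation step found no specific exception class for the VIM fault name and fell back to a generic one, which is why the failure surfaces below as VimFaultException. A hedged sketch of that registry-with-fallback pattern; the registry contents here are illustrative, not oslo.vmware's actual table.

# Sketch: map a VIM fault name to an exception class, falling back to a
# generic VimFaultException when (as in this log) nothing matches.
class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FileFaultException(VimFaultException): ...
class TaskInProgressException(VimFaultException): ...

_FAULT_REGISTRY = {                      # illustrative entries only
    "FileFault": FileFaultException,
    "TaskInProgress": TaskInProgressException,
}

def translate_fault(fault_name, message):
    cls = _FAULT_REGISTRY.get(fault_name)
    if cls is None:
        print(f"Fault {fault_name} not matched.")   # the DEBUG line above
        cls = VimFaultException
    return cls([fault_name], message)

exc = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")
print(type(exc).__name__, exc)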
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 698.235689] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 698.236047] env[68492]: ERROR nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 698.236047] env[68492]: Faults: ['InvalidArgument'] [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Traceback (most recent call last): [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] yield resources [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self.driver.spawn(context, instance, image_meta, [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self._vmops.spawn(context, instance, image_meta, injected_files, [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self._fetch_image_if_missing(context, vi) [ 698.236047] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] image_cache(vi, tmp_image_ds_loc) [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] vm_util.copy_virtual_disk( [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] session._wait_for_task(vmdk_copy_task) [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return self.wait_for_task(task_ref) [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return evt.wait() [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] result = hub.switch() [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 698.236418] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return self.greenlet.switch() [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self.f(*self.args, **self.kw) [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] raise exceptions.translate_fault(task_info.error) [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Faults: ['InvalidArgument'] [ 698.236791] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] [ 698.236791] env[68492]: INFO nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Terminating instance [ 698.237890] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 698.238111] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 698.238355] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b39a1563-d99c-4cdd-ba7d-dd58875f5da0 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.240660] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 698.240815] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquired lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 698.240981] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 698.247661] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 698.247837] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 698.249040] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-353545a2-3843-4622-80ab-af3ce73b5301 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.257342] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Waiting for the task: (returnval){ [ 698.257342] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528a296f-bbde-d109-7482-01ba4fa2665a" [ 698.257342] env[68492]: _type = "Task" [ 698.257342] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 698.265134] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528a296f-bbde-d109-7482-01ba4fa2665a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 698.272178] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Instance cache missing network info. 
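[editor's note] The "Creating directory ... Created directory ... Folder ... created" sequence nearby comes from a deliberately idempotent mkdir: several workers race to prepare the image-cache path, so "already exists" must not be an error. A local-filesystem analogue under that assumption; the real code issues FileManager.MakeDirectory against the datastore and swallows the corresponding fault.

# Sketch: idempotent directory creation, safe to call from racing workers.
import os
import tempfile

def create_folder_if_missing(base, name):
    path = os.path.join(base, name)
    os.makedirs(path, exist_ok=True)   # no-op if another worker won the race
    print(f"Folder {path} created (or already present).")
    return path

base = tempfile.mkdtemp()
create_folder_if_missing(base, "devstack-image-cache_base")
create_folder_if_missing(base, "devstack-image-cache_base")  # second call is harmless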
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 698.334816] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.343714] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Releasing lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 698.344120] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 698.344308] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 698.345386] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15736bc6-c389-4bd2-bab8-b0aef01c6e94 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.355446] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 698.355689] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c42687f1-f369-4e51-a920-1bff5bf39f88 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.384967] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 698.384967] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 698.384967] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Deleting the datastore file [datastore2] e9f787fc-98be-4086-9b70-ebbf33e31d13 {{(pid=68492) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 698.385198] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b3eb13f9-4e6c-400d-b675-3b0beb1e6529 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.391059] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for the task: (returnval){ [ 698.391059] env[68492]: value = "task-3395360" [ 698.391059] env[68492]: _type = "Task" [ 698.391059] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 698.399093] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Task: {'id': task-3395360, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 698.768179] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 698.768512] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Creating directory with path [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 698.768613] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ca78ae29-536d-4bf3-8713-7f1154965904 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.779983] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Created directory with path [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 698.780196] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Fetch image to [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 698.780365] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data 
store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 698.781129] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78330a68-5596-4d26-b8f0-f782428406b8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.787898] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c0f8b5e-51ba-4720-9b18-862051e08357 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.796943] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83d088a3-f3a0-47ed-be6e-24490dd7255e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.827025] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3513199f-fede-42d4-9227-29dd570f5ec2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.832840] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-833e4c14-3ba3-4a8c-a7a0-6dddb441cf75 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.901268] env[68492]: DEBUG oslo_vmware.api [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Task: {'id': task-3395360, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037154} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 698.901544] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 698.901724] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 698.901883] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 698.903260] env[68492]: INFO nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Took 0.56 seconds to destroy the instance on the hypervisor. [ 698.903260] env[68492]: DEBUG oslo.service.loopingcall [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
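[editor's note] The destroy sequence above is strictly ordered: unregister the VM from vCenter first, then delete its datastore directory, with the whole span timed for the "Took 0.56 seconds to destroy" line. A schematic of that ordering (not Nova's destroy path; the two callables are stand-ins for the vCenter operations named in the log):

# Sketch: ordered teardown with timing, mirroring the log lines above.
import time

def destroy_instance(uuid, unregister_vm, delete_datastore_dir):
    start = time.monotonic()
    unregister_vm(uuid)          # VirtualMachine.UnregisterVM
    delete_datastore_dir(uuid)   # FileManager.DeleteDatastoreFile_Task
    elapsed = time.monotonic() - start
    print(f"[instance: {uuid}] Took {elapsed:.2f} seconds "
          f"to destroy the instance on the hypervisor.")

destroy_instance("e9f787fc-98be-4086-9b70-ebbf33e31d13",
                 unregister_vm=lambda u: time.sleep(0.01),
                 delete_datastore_dir=lambda u: time.sleep(0.01))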
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 698.903260] env[68492]: DEBUG nova.compute.manager [-] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Skipping network deallocation for instance since networking was not requested. {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 698.904743] env[68492]: DEBUG nova.compute.claims [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 698.904915] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 698.905467] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 698.917955] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 698.991294] env[68492]: DEBUG oslo_vmware.rw_handles [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 699.054531] env[68492]: DEBUG oslo_vmware.rw_handles [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 699.054840] env[68492]: DEBUG oslo_vmware.rw_handles [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
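The two rw_handles entries above bracket the actual image copy: Nova opens an HTTPS write connection to the ESX host's /folder endpoint, announces the exact byte count (21318656), streams the image iterator, then closes the handle. A simplified standard-library sketch of that write-handle pattern; the request method and `cookie` argument are illustrative stand-ins for what oslo_vmware.rw_handles negotiates (for example via the SessionManager.AcquireGenericServiceTicket call logged above):

    import http.client
    import urllib.parse

    def open_write_handle(url, size, cookie=None):
        # Open an HTTPS connection and start an upload whose Content-Length
        # is the image size logged above; authentication details are elided.
        parts = urllib.parse.urlsplit(url)
        conn = http.client.HTTPSConnection(parts.netloc)
        path = parts.path + ('?' + parts.query if parts.query else '')
        conn.putrequest('PUT', path)
        conn.putheader('Content-Length', str(size))
        if cookie:
            conn.putheader('Cookie', cookie)
        conn.endheaders()
        return conn

    def write_image(conn, data_iter):
        # "Completed reading data from the image iterator" corresponds to
        # exhausting data_iter here; the handle is closed separately.
        for chunk in data_iter:
            conn.send(chunk)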
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 699.398218] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45f3fcc8-ff7b-400a-8218-1fcf44b277c5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.405660] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d1bc2b1-4bd8-4fe0-bc34-2f2806cdf9b0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.435371] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9399d11-aa0f-4172-b6b8-202f82cedeb7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.443103] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a0ee1f-e41c-4a29-b158-a793901d381f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.457135] env[68492]: DEBUG nova.compute.provider_tree [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 699.467142] env[68492]: DEBUG nova.scheduler.client.report [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 699.487025] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.582s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 699.487409] env[68492]: ERROR nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 699.487409] env[68492]: Faults: ['InvalidArgument'] [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Traceback (most recent call last): [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 699.487409] env[68492]: ERROR nova.compute.manager 
[instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self.driver.spawn(context, instance, image_meta, [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self._vmops.spawn(context, instance, image_meta, injected_files, [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self._fetch_image_if_missing(context, vi) [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] image_cache(vi, tmp_image_ds_loc) [ 699.487409] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] vm_util.copy_virtual_disk( [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] session._wait_for_task(vmdk_copy_task) [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return self.wait_for_task(task_ref) [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return evt.wait() [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] result = hub.switch() [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] return self.greenlet.switch() [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 699.487697] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] self.f(*self.args, **self.kw) [ 699.487976] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 699.487976] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] raise exceptions.translate_fault(task_info.error) [ 699.487976] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 699.487976] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Faults: ['InvalidArgument'] [ 699.487976] env[68492]: ERROR nova.compute.manager [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] [ 699.488214] env[68492]: DEBUG nova.compute.utils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 699.493659] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Build of instance e9f787fc-98be-4086-9b70-ebbf33e31d13 was re-scheduled: A specified parameter was not correct: fileType [ 699.493659] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 699.493659] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 699.493659] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquiring lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 699.493659] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Acquired lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 699.493965] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 699.519502] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 699.583759] env[68492]: DEBUG nova.network.neutron [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.592230] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Releasing lock "refresh_cache-e9f787fc-98be-4086-9b70-ebbf33e31d13" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 699.592493] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 699.592729] env[68492]: DEBUG nova.compute.manager [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] [instance: e9f787fc-98be-4086-9b70-ebbf33e31d13] Skipping network deallocation for instance since networking was not requested. {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 699.684300] env[68492]: INFO nova.scheduler.client.report [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Deleted allocations for instance e9f787fc-98be-4086-9b70-ebbf33e31d13 [ 699.701694] env[68492]: DEBUG oslo_concurrency.lockutils [None req-19e7149a-2877-47d5-8c37-3ac19f541cba tempest-ServersAaction247Test-855737930 tempest-ServersAaction247Test-855737930-project-member] Lock "e9f787fc-98be-4086-9b70-ebbf33e31d13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 137.708s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 699.714789] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Starting instance...
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 699.761361] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 699.761639] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 699.763031] env[68492]: INFO nova.compute.claims [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 700.155721] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-893142eb-98b9-46a8-a2e5-64e82b34882a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.163576] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-016e9f8d-91f9-4063-88aa-18fa8f6dc41b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.194974] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2cb2e8-3dfb-4443-9b3a-51996138dc02 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.202495] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7be21fab-0f33-4d3e-8e6f-db672f465ce3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.215399] env[68492]: DEBUG nova.compute.provider_tree [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 700.224150] env[68492]: DEBUG nova.scheduler.client.report [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
700.236424] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.475s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 700.236876] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 700.272213] env[68492]: DEBUG nova.compute.utils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 700.276211] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 700.276211] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 700.284105] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 700.354863] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Start spawning the instance on the hypervisor. 
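The claim above runs entirely inside one named lock: `instance_claim` waits for and acquires "compute_resources", updates the tracker's accounting, and releases it 0.475s later, which is exactly the acquire/release pair oslo_concurrency.lockutils prints. A minimal stand-in for that named-lock pattern using only the standard library (the tracker call in the usage note is hypothetical):

    import threading
    from contextlib import contextmanager

    _locks = {}
    _registry_guard = threading.Lock()

    @contextmanager
    def named_lock(name):
        # One process-wide lock object per name, so every claim, abort and
        # resource-tracker update on "compute_resources" serializes.
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        with lock:
            yield

    # Usage sketch:
    # with named_lock('compute_resources'):
    #     tracker.instance_claim(context, instance, nodename, allocations)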
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 700.360859] env[68492]: DEBUG nova.policy [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79376135bf194e8c9cb58f6551c271f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bcef4c8de61b4a6a995bf7f3c7fabcec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 700.378581] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 700.378829] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 700.378984] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 700.379186] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 700.379331] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 700.379488] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 700.379698] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 700.379857] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 700.380030] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 700.380200] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 700.380456] env[68492]: DEBUG nova.virt.hardware [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 700.381283] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92f67f5d-6162-4b39-99c4-175d3d01ff90 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.389317] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337d9614-99f5-4c94-9ec3-938dae9154c1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.723946] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Successfully created port: 5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 701.374165] env[68492]: DEBUG nova.compute.manager [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Received event network-vif-plugged-5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 701.374436] env[68492]: DEBUG oslo_concurrency.lockutils [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] Acquiring lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.374586] env[68492]: DEBUG oslo_concurrency.lockutils [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.374749] env[68492]: DEBUG oslo_concurrency.lockutils [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 701.374911] env[68492]: DEBUG nova.compute.manager [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] No waiting events found dispatching network-vif-plugged-5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 701.375088] env[68492]: WARNING nova.compute.manager [req-4fdbc74d-8510-4eb9-b355-e93b3f5f6631 req-574ba70d-93ac-4695-b9e7-e9582ed11078 service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Received unexpected event network-vif-plugged-5ff76d18-b9f4-4cc6-863a-9fe14bb879ac for instance with vm_state building and task_state spawning. [ 701.395442] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Successfully updated port: 5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 701.411965] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 701.412088] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquired lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 701.412242] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 701.465521] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance cache missing network info.
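The "-events" lock cycle above is Nova's external-event latch at work: the Neutron notification for network-vif-plugged arrived before anything registered a waiter, so pop_instance_event found no latch and the manager logged the "unexpected event" warning, which is harmless while the instance is still spawning. A compact sketch of that register/pop handshake (class and method names simplified from the real nova.compute.manager code):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._events = {}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, event_name):
            # A waiter registers a latch before triggering the operation.
            latch = threading.Event()
            with self._lock:
                self._events[(instance_uuid, event_name)] = latch
            return latch

        def pop(self, instance_uuid, event_name):
            # The external-event handler pops the latch; None means nobody
            # was waiting yet -- the "No waiting events found" case.
            with self._lock:
                return self._events.pop((instance_uuid, event_name), None)

    events = InstanceEvents()
    latch = events.pop('cbddbd81', 'network-vif-plugged')
    if latch is None:
        print('Received unexpected event')  # matches the WARNING above
    else:
        latch.set()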
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 701.637465] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Updating instance_info_cache with network_info: [{"id": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "address": "fa:16:3e:92:e3:f3", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff76d18-b9", "ovs_interfaceid": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.647372] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Releasing lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 701.647658] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance network_info: |[{"id": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "address": "fa:16:3e:92:e3:f3", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff76d18-b9", "ovs_interfaceid": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 701.648058] env[68492]: DEBUG nova.virt.vmwareapi.vmops 
[None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:92:e3:f3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5ff76d18-b9f4-4cc6-863a-9fe14bb879ac', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 701.655558] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Creating folder: Project (bcef4c8de61b4a6a995bf7f3c7fabcec). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.656106] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf63855d-f972-4642-b74c-20a69eff5e0d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.668999] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Created folder: Project (bcef4c8de61b4a6a995bf7f3c7fabcec) in parent group-v677434. [ 701.668999] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Creating folder: Instances. Parent ref: group-v677469. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.668999] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5a0eb5ab-ab93-4398-aa56-36554b22ddc3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.678051] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Created folder: Instances in parent group-v677469. [ 701.678212] env[68492]: DEBUG oslo.service.loopingcall [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 701.678505] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 701.678730] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2012a386-875b-40a4-bb5f-fdacd7e6c6d7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.698744] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 701.698744] env[68492]: value = "task-3395363" [ 701.698744] env[68492]: _type = "Task" [ 701.698744] env[68492]: } to complete. 
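The `(returnval){ value = "task-3395363" ... }` block is the suds rendering of the task reference that CreateVM_Task returned; wait_for_task then polls it until it completes ("progress is 0%" followed by `duration_secs: 0.278377`). A schematic version of that polling loop, with TaskInfo standing in for the real vSphere Task.info property:

    import time
    from collections import namedtuple

    TaskInfo = namedtuple('TaskInfo', 'state result error')

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # Poll the task until the server reports success, raising on error --
        # the loop that produced the "progress is 0%" entries above.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware would run translate_fault() on info.error here.
                raise RuntimeError(info.error)
            time.sleep(poll_interval)
        raise TimeoutError('task did not complete in time')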
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 701.706147] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395363, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 702.215196] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395363, 'name': CreateVM_Task, 'duration_secs': 0.278377} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 702.215196] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 702.216117] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 702.216368] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 702.216740] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 702.217385] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d791db93-d368-408b-8b4a-9bf55528ce01 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.226722] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for the task: (returnval){ [ 702.226722] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e534d4-d312-0b3a-d67f-8cb1ea33ab5c" [ 702.226722] env[68492]: _type = "Task" [ 702.226722] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.237720] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e534d4-d312-0b3a-d67f-8cb1ea33ab5c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 702.742703] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 702.742703] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 702.742703] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.423421] env[68492]: DEBUG nova.compute.manager [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Received event network-changed-5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 703.423618] env[68492]: DEBUG nova.compute.manager [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Refreshing instance network info cache due to event network-changed-5ff76d18-b9f4-4cc6-863a-9fe14bb879ac. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 703.423851] env[68492]: DEBUG oslo_concurrency.lockutils [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] Acquiring lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.424065] env[68492]: DEBUG oslo_concurrency.lockutils [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] Acquired lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.424176] env[68492]: DEBUG nova.network.neutron [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Refreshing network info cache for port 5ff76d18-b9f4-4cc6-863a-9fe14bb879ac {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 703.882868] env[68492]: DEBUG nova.network.neutron [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Updated VIF entry in instance network info cache for port 5ff76d18-b9f4-4cc6-863a-9fe14bb879ac. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 703.883222] env[68492]: DEBUG nova.network.neutron [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Updating instance_info_cache with network_info: [{"id": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "address": "fa:16:3e:92:e3:f3", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ff76d18-b9", "ovs_interfaceid": "5ff76d18-b9f4-4cc6-863a-9fe14bb879ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.896074] env[68492]: DEBUG oslo_concurrency.lockutils [req-ccab8c4d-0b98-4dda-82dc-569415c8d18a req-b354b974-6e09-4afe-b79a-c1e0d624e7ee service nova] Releasing lock "refresh_cache-cbddbd81-2931-4d28-bd69-ef3f8f1e366c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 705.378396] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "569b49ff-047a-4494-b869-6598764da9d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.378684] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "569b49ff-047a-4494-b869-6598764da9d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 746.067511] env[68492]: WARNING oslo_vmware.rw_handles [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 746.067511] env[68492]: ERROR
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 746.067511] env[68492]: ERROR oslo_vmware.rw_handles [ 746.068068] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 746.069938] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 746.070248] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Copying Virtual Disk [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/56221e05-b262-4e79-87ef-48785e4ca572/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 746.070545] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d699899f-26a8-4408-8e6a-007dd1e49c6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.079989] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Waiting for the task: (returnval){ [ 746.079989] env[68492]: value = "task-3395364" [ 746.079989] env[68492]: _type = "Task" [ 746.079989] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 746.087256] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Task: {'id': task-3395364, 'name': CopyVirtualDisk_Task} progress is 0%. 
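The WARNING above is the benign side of closing the write handle: once all bytes have arrived, the ESX host may simply drop the connection instead of answering the final response read, so rw_handles logs the RemoteDisconnected and carries on, and the very next entry still reports the image as downloaded. A sketch of that tolerant close, paired with the write-handle sketch earlier:

    import http.client
    import logging

    LOG = logging.getLogger(__name__)

    def close_write_handle(conn):
        # Read the server's response if there is one; treat a dropped
        # connection after a complete upload as a warning, not a failure.
        try:
            conn.getresponse()
        except http.client.RemoteDisconnected:
            LOG.warning('Error occurred while reading the HTTP response.',
                        exc_info=True)
        finally:
            conn.close()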
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 746.593030] env[68492]: DEBUG oslo_vmware.exceptions [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 746.593030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 746.593030] env[68492]: ERROR nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 746.593030] env[68492]: Faults: ['InvalidArgument'] [ 746.593030] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Traceback (most recent call last): [ 746.593030] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 746.593030] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] yield resources [ 746.593030] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 746.593030] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self.driver.spawn(context, instance, image_meta, [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self._fetch_image_if_missing(context, vi) [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] image_cache(vi, tmp_image_ds_loc) [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] vm_util.copy_virtual_disk( [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] session._wait_for_task(vmdk_copy_task) [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return self.wait_for_task(task_ref) [ 746.593350] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return evt.wait() [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] result = hub.switch() [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return self.greenlet.switch() [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self.f(*self.args, **self.kw) [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] raise exceptions.translate_fault(task_info.error) [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Faults: ['InvalidArgument'] [ 746.593663] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] [ 746.593992] env[68492]: INFO nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Terminating instance [ 746.593992] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 746.593992] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 
tempest-ImagesOneServerTestJSON-285364769-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 746.593992] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af24537b-86f3-409a-8601-df4b6b3ee093 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.596144] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 746.596581] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 746.597086] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abb84723-11f1-4285-b49f-ffa4fc299757 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.604034] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 746.604145] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-73eddf56-7d60-46b4-9537-6e5daebd2152 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.606379] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 746.606554] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 746.607476] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5d329d8-f2a0-456b-a0ac-a6d33520553e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.612059] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for the task: (returnval){ [ 746.612059] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]529abeb3-4b22-ba7c-c10e-1414356735ef" [ 746.612059] env[68492]: _type = "Task" [ 746.612059] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 746.619114] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]529abeb3-4b22-ba7c-c10e-1414356735ef, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 746.673033] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 746.673199] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 746.673388] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Deleting the datastore file [datastore2] 02050238-c4a5-4c06-952d-06af14ff7d35 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 746.673649] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e641f0db-d48a-4d58-8f14-a0ba893da76d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.679325] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Waiting for the task: (returnval){ [ 746.679325] env[68492]: value = "task-3395366" [ 746.679325] env[68492]: _type = "Task" [ 746.679325] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 746.686715] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Task: {'id': task-3395366, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 747.124327] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 747.124632] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Creating directory with path [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 747.124839] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4eb3799a-3117-45ab-b0d0-92ca2e2e305f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 747.136161] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Created directory with path [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 747.138338] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Fetch image to [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 747.138338] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 747.138338] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eefaa59-5a12-4954-8c15-cd3cf2a237cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 747.144592] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3fcfa70-a3e3-437f-921c-5c013887ad03 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 747.153456] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56fcd783-7aec-42ad-9a88-289d337c7dae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 747.187226] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4e2ea80-6232-4056-8f0f-7f6d0c8b190b 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 747.194388] env[68492]: DEBUG oslo_vmware.api [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Task: {'id': task-3395366, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074135} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 747.195866] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 747.196067] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 747.196246] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 747.196414] env[68492]: INFO nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 747.198152] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ab3303fe-7e1c-4bcc-a96e-1e658c0ad7b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.200049] env[68492]: DEBUG nova.compute.claims [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 747.200234] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 747.200447] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 747.287135] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 747.343440] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 747.405099] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 747.405299] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 747.689339] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699881b7-3e64-4796-a718-1db9a54a9503 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.698789] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5154f5de-1f09-49cf-bd9a-b318c9a0d7d5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.731188] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695428ff-420d-4ca5-a0b4-9fa512257907 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.738156] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b62b278-f088-48aa-b58e-775467ed9e65 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.752137] env[68492]: DEBUG nova.compute.provider_tree [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 747.760691] env[68492]: DEBUG nova.scheduler.client.report [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 747.775632] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.575s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 747.777045] env[68492]: ERROR nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 747.777045] env[68492]: Faults: ['InvalidArgument']
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Traceback (most recent call last):
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self.driver.spawn(context, instance, image_meta,
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self._fetch_image_if_missing(context, vi)
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] image_cache(vi, tmp_image_ds_loc)
[ 747.777045] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] vm_util.copy_virtual_disk(
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] session._wait_for_task(vmdk_copy_task)
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return self.wait_for_task(task_ref)
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return evt.wait()
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] result = hub.switch()
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] return self.greenlet.switch()
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 747.777486] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] self.f(*self.args, **self.kw)
[ 747.777805] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 747.777805] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] raise exceptions.translate_fault(task_info.error)
[ 747.777805] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 747.777805] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Faults: ['InvalidArgument']
[ 747.777805] env[68492]: ERROR nova.compute.manager [instance: 02050238-c4a5-4c06-952d-06af14ff7d35]
[ 747.777805] env[68492]: DEBUG nova.compute.utils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 747.778634] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Build of instance 02050238-c4a5-4c06-952d-06af14ff7d35 was re-scheduled: A specified parameter was not correct: fileType
[ 747.778634] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 747.779067] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 747.779288] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 747.779490] env[68492]: DEBUG nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 747.779685] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 748.085507] env[68492]: DEBUG nova.network.neutron [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 748.098020] env[68492]: INFO nova.compute.manager [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] Took 0.32 seconds to deallocate network for instance.
[ 748.201437] env[68492]: INFO nova.scheduler.client.report [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Deleted allocations for instance 02050238-c4a5-4c06-952d-06af14ff7d35
[ 748.234600] env[68492]: DEBUG oslo_concurrency.lockutils [None req-926273ff-6ebc-43b7-b1a1-8266ee6d90c3 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "02050238-c4a5-4c06-952d-06af14ff7d35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 188.742s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 748.236030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "02050238-c4a5-4c06-952d-06af14ff7d35" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 188.003s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 748.236251] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 02050238-c4a5-4c06-952d-06af14ff7d35] During sync_power_state the instance has a pending task (spawning). Skip.
[ 748.237284] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "02050238-c4a5-4c06-952d-06af14ff7d35" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 748.249815] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 748.318578] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 748.318851] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 748.320447] env[68492]: INFO nova.compute.claims [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 748.751495] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75cbcd7c-cac4-4de8-b2c6-988cb6fd7fcb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.759670] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-473daa89-8f59-4004-96ad-6152a5b4bdcb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.788995] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0de9c569-4a25-4862-8ddf-09528385ad1f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.796434] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c705b1e-142b-4f7c-9586-272cfe866ef7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.809539] env[68492]: DEBUG nova.compute.provider_tree [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 748.823036] env[68492]: DEBUG nova.scheduler.client.report [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 748.840526] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.521s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 748.840638] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 748.875664] env[68492]: DEBUG nova.compute.utils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 748.877115] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 748.877309] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 748.887181] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 748.954729] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 748.979489] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:56:18Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1300342334',id=26,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-814109353',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 748.979662] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 748.979938] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 748.980177] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 748.980328] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 748.980472] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 748.980677] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 748.980835] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 748.980997] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 748.981316] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 748.981610] env[68492]: DEBUG nova.virt.hardware [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 748.982630] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f98d7a3-1848-4a96-9bff-9856f5076da4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.991132] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7638cbdc-8bb0-4024-92e7-8e29df90fe9e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 749.122498] env[68492]: DEBUG nova.policy [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '42bd19f2ed9d4743b7cf72d929f9fddb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2550092b36d847b295d4db50cd6063ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}}
[ 749.461156] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Successfully created port: 5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 749.758713] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0d3f650b-ef47-4541-be9f-32f35f198681 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Acquiring lock "eae1ea40-8ebd-4b7a-9489-e0e70653a517" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 749.758970] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0d3f650b-ef47-4541-be9f-32f35f198681 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "eae1ea40-8ebd-4b7a-9489-e0e70653a517" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 750.042152] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Successfully updated port: 5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 750.058246] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 750.058407] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquired lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 750.058735] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 750.097031] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 750.275619] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Updating instance_info_cache with network_info: [{"id": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "address": "fa:16:3e:e5:44:43", "network": {"id": "09c213ae-357d-44ec-b251-5128ba28ba69", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1944777086-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2550092b36d847b295d4db50cd6063ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a13eb2e-47", "ovs_interfaceid": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 750.287511] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Releasing lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 750.287790] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance network_info: |[{"id": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "address": "fa:16:3e:e5:44:43", "network": {"id": "09c213ae-357d-44ec-b251-5128ba28ba69", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1944777086-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2550092b36d847b295d4db50cd6063ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a13eb2e-47", "ovs_interfaceid": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 750.288202] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e5:44:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '62237242-7ce2-4664-a1c5-6783b516b507', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5a13eb2e-47e1-4edf-a479-ecbb628b4176', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 750.296277] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Creating folder: Project (2550092b36d847b295d4db50cd6063ae). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 750.296808] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c8a3fae2-1ec2-4a78-ab8f-b6f0ab462248 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 750.309017] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Created folder: Project (2550092b36d847b295d4db50cd6063ae) in parent group-v677434.
[ 750.309202] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Creating folder: Instances. Parent ref: group-v677472. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 750.309423] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc3dfeda-4c75-4021-a26a-e3b77272f058 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 750.317803] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Created folder: Instances in parent group-v677472.
[ 750.318028] env[68492]: DEBUG oslo.service.loopingcall [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 750.318207] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 750.318388] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6def671e-ebba-4254-8742-fdb26704944b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 750.339485] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 750.339485] env[68492]: value = "task-3395369"
[ 750.339485] env[68492]: _type = "Task"
[ 750.339485] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 750.347614] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395369, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 750.507429] env[68492]: DEBUG nova.compute.manager [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Received event network-vif-plugged-5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 750.507687] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Acquiring lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 750.507922] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 750.508166] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 750.508360] env[68492]: DEBUG nova.compute.manager [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] No waiting events found dispatching network-vif-plugged-5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 750.508491] env[68492]: WARNING nova.compute.manager [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Received unexpected event network-vif-plugged-5a13eb2e-47e1-4edf-a479-ecbb628b4176 for instance with vm_state building and task_state spawning.
[ 750.508662] env[68492]: DEBUG nova.compute.manager [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Received event network-changed-5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 750.508819] env[68492]: DEBUG nova.compute.manager [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Refreshing instance network info cache due to event network-changed-5a13eb2e-47e1-4edf-a479-ecbb628b4176. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 750.509045] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Acquiring lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 750.509214] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Acquired lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 750.509407] env[68492]: DEBUG nova.network.neutron [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Refreshing network info cache for port 5a13eb2e-47e1-4edf-a479-ecbb628b4176 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 750.762716] env[68492]: DEBUG nova.network.neutron [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Updated VIF entry in instance network info cache for port 5a13eb2e-47e1-4edf-a479-ecbb628b4176. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 750.763099] env[68492]: DEBUG nova.network.neutron [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Updating instance_info_cache with network_info: [{"id": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "address": "fa:16:3e:e5:44:43", "network": {"id": "09c213ae-357d-44ec-b251-5128ba28ba69", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1944777086-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2550092b36d847b295d4db50cd6063ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "62237242-7ce2-4664-a1c5-6783b516b507", "external-id": "nsx-vlan-transportzone-295", "segmentation_id": 295, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5a13eb2e-47", "ovs_interfaceid": "5a13eb2e-47e1-4edf-a479-ecbb628b4176", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.772601] env[68492]: DEBUG oslo_concurrency.lockutils [req-e993306d-a943-4fec-ad77-2dd2e546b2d4 req-36353869-66ea-4f1a-a6fc-b82c725cca2f service nova] Releasing lock "refresh_cache-fcf9c3f0-4f46-4069-887f-fd666e6b3c53" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 750.849256] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395369, 'name': CreateVM_Task, 'duration_secs': 0.290748} completed successfully. 
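The instance_info_cache entry above is a plain JSON-style list of VIF dicts, so the interesting fields can be read with ordinary dict/list access. A small sketch over a structure shaped like the logged one (trimmed to the fields used; values copied from the record above):

    # One VIF entry, reduced from the logged instance_info_cache record.
    vif = {
        "id": "5a13eb2e-47e1-4edf-a479-ecbb628b4176",
        "address": "fa:16:3e:e5:44:43",
        "devname": "tap5a13eb2e-47",
        "network": {"subnets": [{"ips": [{"address": "192.168.128.7"}]}]},
    }

    # Collect every fixed IP across all subnets of the VIF's network.
    fixed_ips = [
        ip["address"]
        for subnet in vif["network"]["subnets"]
        for ip in subnet["ips"]
    ]
    print(vif["id"], vif["address"], fixed_ips)
    # -> 5a13eb2e-47e1-4edf-a479-ecbb628b4176 fa:16:3e:e5:44:43 ['192.168.128.7']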
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 750.849438] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 750.850116] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 750.850297] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 750.850608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 750.851398] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78d232d1-863d-4574-b399-6c95e2ea8d90 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 750.855400] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for the task: (returnval){ [ 750.855400] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525926a1-1b13-762c-eee4-d16346cb4817" [ 750.855400] env[68492]: _type = "Task" [ 750.855400] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 750.862949] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525926a1-1b13-762c-eee4-d16346cb4817, 'name': SearchDatastore_Task} progress is 0%. 
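The image-cache records above serialize on an oslo.concurrency named lock keyed by the datastore path, which is why each spawn logs Acquiring/Acquired/Releasing around "[datastore2] devstack-image-cache_base/...": only one request fetches a given image at a time. A minimal sketch using the public lockutils.lock() context manager (the lock name is copied from the log; the body is a placeholder for the real check-then-fetch step):

    from oslo_concurrency import lockutils

    CACHE_LOCK = ("[datastore2] devstack-image-cache_base/"
                  "595bda25-3485-4d7e-9f66-50f61186cadc")

    def ensure_cached_image(fetch_image):
        # fetch_image is a hypothetical callable that downloads the VMDK
        # into the cache directory if it is not already present.
        with lockutils.lock(CACHE_LOCK):
            fetch_image()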
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 751.365416] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 751.365677] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 751.365885] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 752.231276] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.254130] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.231982] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.232357] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.232550] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.232712] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.232867] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.246172] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 754.246420] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.246587] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 754.246741] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 754.248344] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39caba26-acaf-45f4-9ed6-62411d7ae8ff {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.258422] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13acd37e-b65c-4185-87ec-e58c7ae9124b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.273014] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eb51447-bd7e-4123-a649-336305f792f7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.279525] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a65e7b8-039c-4190-9606-c5ce95bdfc53 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.310024] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180978MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 754.310199] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 754.310394] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.387651] env[68492]: DEBUG nova.compute.resource_tracker [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3de34725-4b54-4956-b2b6-285c9138e94c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.387817] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.387946] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f3c94673-a8fc-4ead-9907-4347cd6244ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388084] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388206] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388322] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388437] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388551] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388663] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.388773] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 754.400104] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 26967217-559c-4987-ba55-6eb1ff782b24 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.410676] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e8f36d0a-e116-4bc4-91a4-a6c463a6c373 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.420841] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f71b71d9-18c5-4715-ad3b-9d7ac2063d31 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.430722] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f73c13d0-db0e-4a74-9ece-62f364bf8383 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.442039] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.454025] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 431adf1d-c988-4832-96c1-6d7ae8de0745 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.465850] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 20538544-eb9b-4f0e-a49e-120fc721f651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.474762] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.484994] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d947bb3a-3877-4628-9b83-8d380b47261d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.494879] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1509151e-59a9-41b2-ad52-22a5d888bd5d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.504812] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.516661] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aae38f8c-fe29-478b-946a-1f75bb9434a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.526952] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e410e6fa-7652-45d1-8ec1-f1c1db5c728f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.537797] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e7c66cb6-10fc-44d4-9821-6e3141e04024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.548993] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 31f0fab8-123f-4857-93a7-517ac44dbf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.559564] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d720fc20-a7a6-4826-9174-2fb12bb0a6c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.570233] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2590f6bd-a48f-49ad-b955-a0ebec9d31e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.582749] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9d15dfea-323f-4007-91cb-0a0b64d60a5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.594519] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 81d59156-2869-4045-a2d3-349e6077f477 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.608184] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1ee59a29-0ef7-4906-a027-90992418c3fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.618637] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.629136] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.638544] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance eae1ea40-8ebd-4b7a-9489-e0e70653a517 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 754.638784] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 754.638930] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 754.994804] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c77b0229-a45e-4cac-bde3-9344793334a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.002284] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-079fc630-e589-4b98-9243-25017f77ff05 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.031309] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad031941-be6e-48ec-a179-553bad103bbb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.038528] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0501e1a2-7129-4624-a457-afdfd33d8806 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.051666] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: 
dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 755.060354] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 755.076369] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 755.076435] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.766s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 756.074675] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.075040] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.075214] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 756.231470] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.231639] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 756.231765] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 756.251720] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.251876] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252040] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252180] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252304] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252425] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252543] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252660] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252776] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.252891] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 756.253054] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
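The resource-tracker pass above is reproducible from the logged numbers: ten instances each hold {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, the final view reports used_ram=1792MB / used_disk=10GB / used_vcpus=10, and the provider inventory supplies the allocation ratios. A worked sketch; the capacity line uses placement's usual (total - reserved) * allocation_ratio sizing:

    # Sum the per-instance allocations listed in the records above.
    allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10
    used = {}
    for alloc in allocations:
        for rc, amount in alloc.items():
            used[rc] = used.get(rc, 0) + amount

    assert used["VCPU"] == 10               # "total allocated vcpus: 10"
    assert used["MEMORY_MB"] + 512 == 1792  # used_ram includes the 512 MB reserved
    assert used["DISK_GB"] == 10            # "used_disk=10GB"

    # Capacity from the logged VCPU inventory: (total - reserved) * ratio.
    vcpu_inv = {"total": 48, "reserved": 0, "allocation_ratio": 4.0}
    capacity = (vcpu_inv["total"] - vcpu_inv["reserved"]) * vcpu_inv["allocation_ratio"]
    assert capacity == 192.0                # schedulable VCPUs at ratio 4.0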
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 760.268379] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "3de34725-4b54-4956-b2b6-285c9138e94c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 771.177181] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 773.817628] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "b7e0d1c7-d21b-42c1-b400-86be946df689" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 773.933956] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 776.474573] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "12450355-d90e-40dc-b66f-6105ec320d19" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.657739] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "14af3749-f031-4543-96e4-af0b4fd28e2b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 787.549928] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.338720] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" by
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.904899] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.409657] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "8c72085d-697c-4829-866a-4d642f18d2f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.409999] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 797.048709] env[68492]: WARNING oslo_vmware.rw_handles [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 797.048709] env[68492]: ERROR oslo_vmware.rw_handles [ 797.052027] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store 
datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 797.052772] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 797.053153] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Copying Virtual Disk [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/38169ba8-18e3-4d7c-8163-358b109b6bf1/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 797.053598] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-826e662e-286a-4138-93d4-1785a7cd8e04 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.062028] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for the task: (returnval){ [ 797.062028] env[68492]: value = "task-3395370" [ 797.062028] env[68492]: _type = "Task" [ 797.062028] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 797.071858] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Task: {'id': task-3395370, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 797.572323] env[68492]: DEBUG oslo_vmware.exceptions [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Fault InvalidArgument not matched. 
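"Fault InvalidArgument not matched" above means oslo.vmware found no dedicated exception class for the vCenter fault, so the generic VimFaultException is raised with the fault names preserved in its fault_list. A hedged sketch of inspecting it (wait_for is a hypothetical stand-in for the session's task wait):

    import logging
    from oslo_vmware import exceptions as vexc

    LOG = logging.getLogger(__name__)

    def wait_checked(wait_for, task_ref):
        try:
            return wait_for(task_ref)
        except vexc.VimFaultException as e:
            if 'InvalidArgument' in e.fault_list:
                # The 'A specified parameter was not correct: fileType'
                # failure in the traceback below takes this path.
                LOG.error("vCenter rejected a parameter: %s", e)
            raise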
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 797.572796] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 797.573189] env[68492]: ERROR nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.573189] env[68492]: Faults: ['InvalidArgument'] [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Traceback (most recent call last): [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] yield resources [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self.driver.spawn(context, instance, image_meta, [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self._fetch_image_if_missing(context, vi) [ 797.573189] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] image_cache(vi, tmp_image_ds_loc) [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] vm_util.copy_virtual_disk( [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] session._wait_for_task(vmdk_copy_task) [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return self.wait_for_task(task_ref) [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return evt.wait() [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] result = hub.switch() [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.573833] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return self.greenlet.switch() [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self.f(*self.args, **self.kw) [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] raise exceptions.translate_fault(task_info.error) [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Faults: ['InvalidArgument'] [ 797.574411] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] [ 797.574411] env[68492]: INFO nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Terminating instance [ 797.575801] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Start destroying the instance on the hypervisor. 
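The traceback above ends with the manager pivoting from build to teardown ("Instance failed to spawn" followed by "Terminating instance"). A compressed sketch of that control flow, with illustrative names rather than Nova's exact signatures:

    import logging

    LOG = logging.getLogger(__name__)

    def build_instance(driver, context, instance, image_meta):
        # Illustrative only: spawn() raising (here, the VimFaultException
        # above) is logged and answered by destroying the partial VM.
        try:
            driver.spawn(context, instance, image_meta)
        except Exception:
            LOG.exception("Instance failed to spawn: %s", instance)
            driver.destroy(context, instance)
            raise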
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 797.575991] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 797.576450] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 797.576657] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.577414] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-736d9e2b-95ec-45b8-b8c2-ebbc2e3286ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.580414] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1196c5ea-59f1-4966-8e7f-d6be6e551903 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.587093] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 797.587389] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ec535f6e-43b3-4c8b-a0da-a7c109afffd3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.589718] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 797.589892] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Folder [datastore2] devstack-image-cache_base created. 
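The directory-creation records above follow the usual create-if-missing idempotency: issue FileManager.MakeDirectory and treat an already-exists fault as success, so concurrent requests cannot race each other into an error. A sketch assuming oslo.vmware's FileAlreadyExistsException and a hypothetical mkdir_on_ds callable:

    from oslo_vmware import exceptions as vexc

    def create_folder_if_missing(mkdir_on_ds, ds_path):
        # mkdir_on_ds stands in for the FileManager.MakeDirectory invocation.
        try:
            mkdir_on_ds(ds_path)
        except vexc.FileAlreadyExistsException:
            pass  # another request created it first; that is fine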
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 797.590873] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dffdabac-dc4b-4346-b1d1-81b1e434247f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.596020] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for the task: (returnval){ [ 797.596020] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520b42c9-009d-f99b-09b9-613fbd038f3f" [ 797.596020] env[68492]: _type = "Task" [ 797.596020] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 797.603956] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520b42c9-009d-f99b-09b9-613fbd038f3f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 797.669857] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 797.670120] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 797.670218] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Deleting the datastore file [datastore2] 3de34725-4b54-4956-b2b6-285c9138e94c {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 797.670485] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4558291d-c263-4140-8dfb-02366ffd7f29 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.677384] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for the task: (returnval){ [ 797.677384] env[68492]: value = "task-3395372" [ 797.677384] env[68492]: _type = "Task" [ 797.677384] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 797.686361] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Task: {'id': task-3395372, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 798.110373] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 798.110373] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Creating directory with path [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 798.110373] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1023cc57-39d5-43f1-9672-4a15f77437c9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.123733] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Created directory with path [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 798.123970] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Fetch image to [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 798.125550] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 798.125550] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-197995a0-4de6-42fc-afd4-abf4956788ec {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.137340] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53c3b212-20c0-41c8-b920-40f58699aadc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.147375] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc1518ab-6b4f-4717-9e84-7532a3aa42b8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.196765] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c53128dd-56cb-430b-8701-63069b179f7e {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.203084] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fa220eb-d4a4-41c9-a4c2-e897af89ef90 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "4f4669ef-c7da-4f9a-9ebe-83947f00863a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.203084] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fa220eb-d4a4-41c9-a4c2-e897af89ef90 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4f4669ef-c7da-4f9a-9ebe-83947f00863a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.210033] env[68492]: DEBUG oslo_vmware.api [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Task: {'id': task-3395372, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.095754} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 798.210315] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 798.210483] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 798.210694] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 798.210808] env[68492]: INFO nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Took 0.63 seconds to destroy the instance on the hypervisor. 
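
Note: the task-3395372 sequence above is the standard oslo.vmware invoke-then-wait pattern: a *_Task SOAP method is invoked through the API session, and wait_for_task polls it (the "progress is 0%" lines) until it completes or raises the task's fault. A minimal sketch of the same pattern, assuming a reachable vCenter; the host, credentials, and datacenter=None argument are illustrative placeholders, not values from this log:

    # Minimal sketch of the invoke-then-wait pattern seen in the trace.
    # Host and credentials are hypothetical placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=3, task_poll_interval=0.5)

    vim = session.vim
    # FileManager.DeleteDatastoreFile_Task removes a datastore path; nova's
    # ds_util.file_delete issues this same call with the instance directory.
    task = session.invoke_api(
        vim, 'DeleteDatastoreFile_Task', vim.service_content.fileManager,
        name='[datastore2] 3de34725-4b54-4956-b2b6-285c9138e94c',
        datacenter=None)  # a datacenter ref is normally passed; None is illustrative
    session.wait_for_task(task)  # polls the task; raises on a task fault
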
[ 798.212329] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e2acb01d-66c1-44e7-911f-4c38a76741ca {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.214473] env[68492]: DEBUG nova.compute.claims [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 798.214691] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.215216] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.306170] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 798.426103] env[68492]: DEBUG oslo_vmware.rw_handles [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 798.488480] env[68492]: DEBUG oslo_vmware.rw_handles [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 798.488480] env[68492]: DEBUG oslo_vmware.rw_handles [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 798.859138] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c275d09-b95b-4984-bfeb-0b43a033a035 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.868392] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f8b97d6-559d-486b-b720-185d628b188b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.902897] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f27391c6-cdcf-4eb6-9b30-b9eadea6c739 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.913121] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eb603c5-9d4c-4df6-8359-a835c21809f7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.925652] env[68492]: DEBUG nova.compute.provider_tree [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.936363] env[68492]: DEBUG nova.scheduler.client.report [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.952539] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.737s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 798.953101] env[68492]: ERROR nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.953101] env[68492]: Faults: ['InvalidArgument'] [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Traceback (most recent call last): [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 798.953101] env[68492]: ERROR 
nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self.driver.spawn(context, instance, image_meta, [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self._fetch_image_if_missing(context, vi) [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] image_cache(vi, tmp_image_ds_loc) [ 798.953101] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] vm_util.copy_virtual_disk( [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] session._wait_for_task(vmdk_copy_task) [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return self.wait_for_task(task_ref) [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return evt.wait() [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] result = hub.switch() [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] return self.greenlet.switch() [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 798.953459] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] self.f(*self.args, **self.kw) [ 798.953784] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 798.953784] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] raise exceptions.translate_fault(task_info.error) [ 798.953784] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.953784] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Faults: ['InvalidArgument'] [ 798.953784] env[68492]: ERROR nova.compute.manager [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] [ 798.953917] env[68492]: DEBUG nova.compute.utils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 798.955445] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Build of instance 3de34725-4b54-4956-b2b6-285c9138e94c was re-scheduled: A specified parameter was not correct: fileType [ 798.955445] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 798.955983] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 798.956063] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 798.956246] env[68492]: DEBUG nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 798.956420] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 799.747103] env[68492]: DEBUG nova.network.neutron [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 799.762111] env[68492]: INFO nova.compute.manager [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Took 0.81 seconds to deallocate network for instance. [ 799.883025] env[68492]: INFO nova.scheduler.client.report [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Deleted allocations for instance 3de34725-4b54-4956-b2b6-285c9138e94c [ 799.914594] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d7e1888a-6f88-4a69-8b0e-73e1e3a34c0f tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.096s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 799.916785] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 39.648s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.916785] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Acquiring lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 799.917076] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 
0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.917388] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 799.920135] env[68492]: INFO nova.compute.manager [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Terminating instance [ 799.926981] env[68492]: DEBUG nova.compute.manager [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 799.927230] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 799.931949] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7595a5ec-0efc-4207-aa56-bec93b068c88 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.937919] env[68492]: DEBUG nova.compute.manager [None req-66c017ed-8ec9-4027-92c3-9c61b16862de tempest-InstanceActionsV221TestJSON-723775731 tempest-InstanceActionsV221TestJSON-723775731-project-member] [instance: 26967217-559c-4987-ba55-6eb1ff782b24] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 799.943586] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8d45694-c4fb-4894-bc6c-3593ea71ce25 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.987020] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3de34725-4b54-4956-b2b6-285c9138e94c could not be found. [ 799.987020] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 799.987020] env[68492]: INFO nova.compute.manager [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Took 0.06 seconds to destroy the instance on the hypervisor. 
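
Note: the "Acquiring lock ... / Lock ... acquired ... waited / released ... held" lines throughout this trace are emitted by oslo.concurrency's lockutils wrapper, which nova uses to serialize operations per instance UUID (and per "<uuid>-events" for event bookkeeping); the waited/held durations come from the wrapper itself, not from nova code. A minimal sketch of the pattern, with a stand-in function body for the do_terminate_instance critical section named in the log:

    # Minimal sketch of the per-instance locking behind the
    # "Lock ... acquired/released" DEBUG lines above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('3de34725-4b54-4956-b2b6-285c9138e94c')
    def do_terminate_instance():
        # Runs with the per-instance lock held; lockutils logs how long the
        # caller waited for the lock and how long it was held, as in the trace.
        pass

    do_terminate_instance()
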
[ 799.987020] env[68492]: DEBUG oslo.service.loopingcall [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 799.987585] env[68492]: DEBUG nova.compute.manager [None req-66c017ed-8ec9-4027-92c3-9c61b16862de tempest-InstanceActionsV221TestJSON-723775731 tempest-InstanceActionsV221TestJSON-723775731-project-member] [instance: 26967217-559c-4987-ba55-6eb1ff782b24] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 799.987585] env[68492]: DEBUG nova.compute.manager [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 799.987585] env[68492]: DEBUG nova.network.neutron [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 800.032819] env[68492]: DEBUG oslo_concurrency.lockutils [None req-66c017ed-8ec9-4027-92c3-9c61b16862de tempest-InstanceActionsV221TestJSON-723775731 tempest-InstanceActionsV221TestJSON-723775731-project-member] Lock "26967217-559c-4987-ba55-6eb1ff782b24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.009s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.038844] env[68492]: DEBUG nova.network.neutron [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 800.046732] env[68492]: INFO nova.compute.manager [-] [instance: 3de34725-4b54-4956-b2b6-285c9138e94c] Took 0.06 seconds to deallocate network for instance. [ 800.054544] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: e8f36d0a-e116-4bc4-91a4-a6c463a6c373] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.089026] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: e8f36d0a-e116-4bc4-91a4-a6c463a6c373] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.147321] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "e8f36d0a-e116-4bc4-91a4-a6c463a6c373" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.407s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.167227] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: f71b71d9-18c5-4715-ad3b-9d7ac2063d31] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.221372] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: f71b71d9-18c5-4715-ad3b-9d7ac2063d31] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.245205] env[68492]: DEBUG oslo_concurrency.lockutils [None req-932eb5fd-4ffb-4875-b803-f8f8add4ddf3 tempest-ImagesOneServerTestJSON-285364769 tempest-ImagesOneServerTestJSON-285364769-project-member] Lock "3de34725-4b54-4956-b2b6-285c9138e94c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.329s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.256343] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "f71b71d9-18c5-4715-ad3b-9d7ac2063d31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.489s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.276700] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: f73c13d0-db0e-4a74-9ece-62f364bf8383] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.302186] env[68492]: DEBUG nova.compute.manager [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] [instance: f73c13d0-db0e-4a74-9ece-62f364bf8383] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.343019] env[68492]: DEBUG oslo_concurrency.lockutils [None req-84a89e89-9f18-4df7-a0e2-38abc24a02d3 tempest-ListServersNegativeTestJSON-1773263508 tempest-ListServersNegativeTestJSON-1773263508-project-member] Lock "f73c13d0-db0e-4a74-9ece-62f364bf8383" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.546s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.357831] env[68492]: DEBUG nova.compute.manager [None req-541f78a2-d337-4fdd-b8c4-42d37871c3e7 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.395041] env[68492]: DEBUG nova.compute.manager [None req-541f78a2-d337-4fdd-b8c4-42d37871c3e7 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.434046] env[68492]: DEBUG oslo_concurrency.lockutils [None req-541f78a2-d337-4fdd-b8c4-42d37871c3e7 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "54c80b2a-d2dc-4303-a2e3-e597c9a9d2d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.060s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.448658] env[68492]: DEBUG nova.compute.manager [None req-add155ab-b916-4fcf-9c47-10a8b210eec6 tempest-ServersNegativeTestJSON-1148478936 tempest-ServersNegativeTestJSON-1148478936-project-member] [instance: 431adf1d-c988-4832-96c1-6d7ae8de0745] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.472676] env[68492]: DEBUG nova.compute.manager [None req-add155ab-b916-4fcf-9c47-10a8b210eec6 tempest-ServersNegativeTestJSON-1148478936 tempest-ServersNegativeTestJSON-1148478936-project-member] [instance: 431adf1d-c988-4832-96c1-6d7ae8de0745] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.504156] env[68492]: DEBUG oslo_concurrency.lockutils [None req-add155ab-b916-4fcf-9c47-10a8b210eec6 tempest-ServersNegativeTestJSON-1148478936 tempest-ServersNegativeTestJSON-1148478936-project-member] Lock "431adf1d-c988-4832-96c1-6d7ae8de0745" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.523s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.514748] env[68492]: DEBUG nova.compute.manager [None req-70544937-df53-4e6d-bb4e-2c2e455cc650 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 20538544-eb9b-4f0e-a49e-120fc721f651] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.576607] env[68492]: DEBUG nova.compute.manager [None req-70544937-df53-4e6d-bb4e-2c2e455cc650 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 20538544-eb9b-4f0e-a49e-120fc721f651] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.599735] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70544937-df53-4e6d-bb4e-2c2e455cc650 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "20538544-eb9b-4f0e-a49e-120fc721f651" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.285s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.616152] env[68492]: DEBUG nova.compute.manager [None req-a226b7a1-69ef-4c35-9f03-0504fb3f179f tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] [instance: ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.648069] env[68492]: DEBUG nova.compute.manager [None req-a226b7a1-69ef-4c35-9f03-0504fb3f179f tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] [instance: ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.670177] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a226b7a1-69ef-4c35-9f03-0504fb3f179f tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Lock "ab6a6bdb-f4c0-4e85-a478-b3d14ee8a1e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.660s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.684755] env[68492]: DEBUG nova.compute.manager [None req-18bee2f4-316a-4c17-8fe2-bc3722cc6928 tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] [instance: d947bb3a-3877-4628-9b83-8d380b47261d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.710199] env[68492]: DEBUG nova.compute.manager [None req-18bee2f4-316a-4c17-8fe2-bc3722cc6928 tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] [instance: d947bb3a-3877-4628-9b83-8d380b47261d] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.738329] env[68492]: DEBUG oslo_concurrency.lockutils [None req-18bee2f4-316a-4c17-8fe2-bc3722cc6928 tempest-ServersAdminTestJSON-2049836212 tempest-ServersAdminTestJSON-2049836212-project-member] Lock "d947bb3a-3877-4628-9b83-8d380b47261d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.225s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.754023] env[68492]: DEBUG nova.compute.manager [None req-b5df45b9-b527-4dbc-abca-981cf8bb032a tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 1509151e-59a9-41b2-ad52-22a5d888bd5d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.784647] env[68492]: DEBUG nova.compute.manager [None req-b5df45b9-b527-4dbc-abca-981cf8bb032a tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 1509151e-59a9-41b2-ad52-22a5d888bd5d] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 800.822231] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b5df45b9-b527-4dbc-abca-981cf8bb032a tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "1509151e-59a9-41b2-ad52-22a5d888bd5d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.079s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.840733] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 800.913437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 800.913701] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 800.915342] env[68492]: INFO nova.compute.claims [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 801.073659] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c4c57657-212f-4931-a6fb-6f36858f9df1 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "09401266-1c03-4c2e-b850-e7196bcb1e9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 801.073950] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c4c57657-212f-4931-a6fb-6f36858f9df1 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "09401266-1c03-4c2e-b850-e7196bcb1e9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.407381] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f5a3ed5-0ab7-43d4-8b5f-7189e7bc5d66 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.421812] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037e711b-3865-4786-8843-6f8c0a1a0578 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.457425] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a48de1b-cef5-4031-bb6a-03890ebbccd2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.465263] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aacc3a59-9ce5-4ce7-b39c-4e75a4e83cb0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.481625] env[68492]: DEBUG nova.compute.provider_tree [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Inventory has not changed in ProviderTree for provider: 
dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 801.492820] env[68492]: DEBUG nova.scheduler.client.report [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 801.515769] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.602s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.516408] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 801.565041] env[68492]: DEBUG nova.compute.utils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 801.566186] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 801.566521] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 801.577399] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 801.663954] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 801.683975] env[68492]: DEBUG nova.policy [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '08124891031848f3a1f199b4f5a0be7f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ad369803433345d19845ca06e123423b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 801.695799] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 801.696048] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 801.696214] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 801.696560] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 801.696560] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 801.696705] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 801.696872] env[68492]: DEBUG nova.virt.hardware 
[None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 801.697105] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 801.697255] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 801.698445] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 801.698445] env[68492]: DEBUG nova.virt.hardware [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 801.698445] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4059ccf1-b9fc-43d0-b7ed-699091d6cf17 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.707353] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0001d61f-195e-4a4f-b0c7-d9da3c09f88e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.474539] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Successfully created port: ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 803.636324] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Successfully updated port: ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 803.648352] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 803.648352] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 
tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquired lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 803.649148] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 803.698069] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 804.065592] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Updating instance_info_cache with network_info: [{"id": "ec2b070d-9441-45f9-a8c0-a6506860a401", "address": "fa:16:3e:82:ba:1a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec2b070d-94", "ovs_interfaceid": "ec2b070d-9441-45f9-a8c0-a6506860a401", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 804.082917] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Releasing lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 804.083140] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance network_info: |[{"id": "ec2b070d-9441-45f9-a8c0-a6506860a401", "address": "fa:16:3e:82:ba:1a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec2b070d-94", "ovs_interfaceid": "ec2b070d-9441-45f9-a8c0-a6506860a401", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 804.083574] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:82:ba:1a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec2b070d-9441-45f9-a8c0-a6506860a401', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 804.092450] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Creating folder: Project (ad369803433345d19845ca06e123423b). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.093630] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6670c398-3951-4f7b-80be-e5d3c6c146d7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.110227] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Created folder: Project (ad369803433345d19845ca06e123423b) in parent group-v677434. [ 804.110227] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Creating folder: Instances. Parent ref: group-v677475. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.110227] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37a40825-a5c2-4958-a316-ed83ddb74d5e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.116889] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Created folder: Instances in parent group-v677475. [ 804.117155] env[68492]: DEBUG oslo.service.loopingcall [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 804.117352] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 804.117636] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b4b7aa78-9884-40df-a7b3-03ff10d4f155 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.141185] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 804.141185] env[68492]: value = "task-3395375" [ 804.141185] env[68492]: _type = "Task" [ 804.141185] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 804.153117] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395375, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 804.304890] env[68492]: DEBUG nova.compute.manager [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Received event network-vif-plugged-ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 804.304998] env[68492]: DEBUG oslo_concurrency.lockutils [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] Acquiring lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 804.308239] env[68492]: DEBUG oslo_concurrency.lockutils [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 804.308239] env[68492]: DEBUG oslo_concurrency.lockutils [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 804.308239] env[68492]: DEBUG nova.compute.manager [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] No waiting events found dispatching network-vif-plugged-ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 804.308481] env[68492]: WARNING nova.compute.manager [req-74d9c267-f0da-4242-816c-191b83e30048 req-aa034c96-43a9-4aeb-b816-51fbd23d2c5e service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Received unexpected event network-vif-plugged-ec2b070d-9441-45f9-a8c0-a6506860a401 for instance with vm_state building and task_state spawning. 
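The CreateVM_Task sequence above illustrates the vSphere task-polling pattern that recurs throughout this log: the SOAP call returns a Task moref (here "task-3395375"), and oslo.vmware's wait_for_task polls its state from inside an eventlet looping call, logging "progress is N%" until the task reports success or error (the loopingcall.py and api.py paths in the records above are where that happens). Below is a minimal plain-Python sketch of the pattern, not oslo.vmware's actual implementation: get_task_info is a hypothetical stand-in for the PropertyCollector read the library really performs, and the poll interval and timeout are illustrative assumptions, not library defaults.

    import time

    POLL_INTERVAL = 0.5   # illustrative assumption, not oslo.vmware's default
    TIMEOUT = 300         # illustrative assumption

    class TaskInfo:
        # Hypothetical snapshot of a vSphere task: state is 'running',
        # 'success', or 'error'; progress is a percentage; error holds the fault.
        def __init__(self, state, progress=0, error=None):
            self.state, self.progress, self.error = state, progress, error

    def wait_for_task(get_task_info):
        # Poll until the task completes, mirroring the log's
        # "progress is N%" / "completed successfully" sequence.
        deadline = time.monotonic() + TIMEOUT
        while time.monotonic() < deadline:
            info = get_task_info()  # hypothetical; the real code reads task state via PropertyCollector
            if info.state == "success":
                return info         # e.g. CreateVM_Task yields the new VM's moref
            if info.state == "error":
                # oslo.vmware raises a translated fault at this point
                raise RuntimeError(info.error)
            print(f"progress is {info.progress}%")
            time.sleep(POLL_INTERVAL)
        raise TimeoutError(f"task did not complete within {TIMEOUT}s")

    # Example: a fake task that succeeds on the third poll.
    _polls = iter([TaskInfo("running", 0), TaskInfo("running", 45),
                   TaskInfo("success", 100)])
    wait_for_task(lambda: next(_polls))

On the error branch, oslo.vmware translates the task fault into a VimFaultException, which is exactly what surfaces later in this log when CopyVirtualDisk_Task fails with "A specified parameter was not correct: fileType" / Faults: ['InvalidArgument'].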
[ 804.654063] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395375, 'name': CreateVM_Task, 'duration_secs': 0.284009} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 804.654333] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 804.655141] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 804.655215] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 804.655580] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 804.655883] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f579e79b-8c52-48e2-90dd-bc662fdfccf4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.660781] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Waiting for the task: (returnval){ [ 804.660781] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52095c32-7b24-81c0-5c34-0a8d5b08c537" [ 804.660781] env[68492]: _type = "Task" [ 804.660781] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 804.674221] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52095c32-7b24-81c0-5c34-0a8d5b08c537, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 805.008156] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.008431] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 805.174119] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 805.174543] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 805.175346] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 806.557082] env[68492]: DEBUG nova.compute.manager [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Received event network-changed-ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 806.557376] env[68492]: DEBUG nova.compute.manager [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Refreshing instance network info cache due to event network-changed-ec2b070d-9441-45f9-a8c0-a6506860a401. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 806.557540] env[68492]: DEBUG oslo_concurrency.lockutils [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] Acquiring lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 806.557643] env[68492]: DEBUG oslo_concurrency.lockutils [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] Acquired lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 806.557773] env[68492]: DEBUG nova.network.neutron [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Refreshing network info cache for port ec2b070d-9441-45f9-a8c0-a6506860a401 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 806.929405] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.392755] env[68492]: DEBUG nova.network.neutron [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Updated VIF entry in instance network info cache for port ec2b070d-9441-45f9-a8c0-a6506860a401. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 807.392755] env[68492]: DEBUG nova.network.neutron [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Updating instance_info_cache with network_info: [{"id": "ec2b070d-9441-45f9-a8c0-a6506860a401", "address": "fa:16:3e:82:ba:1a", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec2b070d-94", "ovs_interfaceid": "ec2b070d-9441-45f9-a8c0-a6506860a401", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.410883] env[68492]: DEBUG oslo_concurrency.lockutils [req-ed6eb0eb-5722-4291-b6d9-9a343ac72956 req-401faf53-805b-4ebd-af6c-77f2be19eb56 service nova] Releasing lock "refresh_cache-93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 811.231049] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.231329] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 811.245713] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] There are 0 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 811.245901] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.246078] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 811.257356] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.816255] env[68492]: DEBUG oslo_concurrency.lockutils 
[None req-f995aecf-0818-40f8-8b8f-1c361b1202e2 tempest-ServerPasswordTestJSON-1753985612 tempest-ServerPasswordTestJSON-1753985612-project-member] Acquiring lock "f48567a8-6b74-46ee-af6b-37823323e17f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.816255] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f995aecf-0818-40f8-8b8f-1c361b1202e2 tempest-ServerPasswordTestJSON-1753985612 tempest-ServerPasswordTestJSON-1753985612-project-member] Lock "f48567a8-6b74-46ee-af6b-37823323e17f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.266103] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 812.908356] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0a00c346-1b9a-42d6-871e-9b332b1662bc tempest-ServerActionsTestOtherB-352976159 tempest-ServerActionsTestOtherB-352976159-project-member] Acquiring lock "a59a286e-ad8c-4628-b326-09762dea3534" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.908590] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0a00c346-1b9a-42d6-871e-9b332b1662bc tempest-ServerActionsTestOtherB-352976159 tempest-ServerActionsTestOtherB-352976159-project-member] Lock "a59a286e-ad8c-4628-b326-09762dea3534" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.226530] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.231840] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.231840] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.231840] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.231840] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 815.231840] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.251933] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.253676] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.253968] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.254229] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 815.256192] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16421dea-74d7-4688-b80e-a5ead30f3249 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.265420] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f689dd6-53ea-41c2-8117-7d11ac9c822e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.281133] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26e341af-5444-4572-906e-eb57a48a14a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.288491] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0faa75f-5944-4c71-b6cc-9a72ec2b0e7b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.322020] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180978MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 815.322020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.322020] 
env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.497165] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497335] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f3c94673-a8fc-4ead-9907-4347cd6244ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497467] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497685] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497742] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497817] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.497950] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.498080] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.498165] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.498275] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 815.513974] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e7c66cb6-10fc-44d4-9821-6e3141e04024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.529215] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 31f0fab8-123f-4857-93a7-517ac44dbf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.541231] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance d720fc20-a7a6-4826-9174-2fb12bb0a6c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.556278] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2590f6bd-a48f-49ad-b955-a0ebec9d31e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.567916] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9d15dfea-323f-4007-91cb-0a0b64d60a5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.582057] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 81d59156-2869-4045-a2d3-349e6077f477 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.595891] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 1ee59a29-0ef7-4906-a027-90992418c3fb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.611032] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.622699] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.635657] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance eae1ea40-8ebd-4b7a-9489-e0e70653a517 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.650257] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.663329] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f4669ef-c7da-4f9a-9ebe-83947f00863a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.677249] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 09401266-1c03-4c2e-b850-e7196bcb1e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.693529] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.707544] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f48567a8-6b74-46ee-af6b-37823323e17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.723220] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a59a286e-ad8c-4628-b326-09762dea3534 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 815.725011] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 815.725011] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 815.747954] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 815.773437] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 815.776910] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 815.790333] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 815.811929] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 816.283256] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71fb0008-ad46-431b-93bf-5b3cf72955b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.291617] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c302d2fb-ceac-4a5c-ad05-0fc823c64500 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.326702] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dea66f6f-5674-4722-803d-41d6cffa072a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.334139] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71111389-d5c9-4020-910d-dc729f05321c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.347699] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 816.361224] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.388754] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 816.388754] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.067s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 817.388301] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 817.388702] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 817.388702] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 817.409074] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409253] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409373] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409501] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409626] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409748] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409868] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.409986] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.411406] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.411562] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 817.411689] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 817.412207] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 817.412397] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 817.792661] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6813a253-afce-4168-803d-2470c90de818 tempest-ServerActionsTestJSON-1562591659 tempest-ServerActionsTestJSON-1562591659-project-member] Acquiring lock "2598cded-78b6-4230-98c5-7068b429a56c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.792914] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6813a253-afce-4168-803d-2470c90de818 tempest-ServerActionsTestJSON-1562591659 tempest-ServerActionsTestJSON-1562591659-project-member] Lock "2598cded-78b6-4230-98c5-7068b429a56c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 821.128782] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0978fdad-7b06-4c3e-8104-0e06cce8ca05 tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Acquiring lock "ab820eba-d4d5-4b07-bc68-79c4b8cf46c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 821.129089] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0978fdad-7b06-4c3e-8104-0e06cce8ca05 tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "ab820eba-d4d5-4b07-bc68-79c4b8cf46c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 832.256242] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2858edaf-85d0-4282-8ac0-4604025c8ef5 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Acquiring lock "0de36474-6ab2-4c5c-a85c-5080d82b3f8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 832.256608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2858edaf-85d0-4282-8ac0-4604025c8ef5 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Lock "0de36474-6ab2-4c5c-a85c-5080d82b3f8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 834.704912] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-bab765e0-3ba8-4cc5-9ca4-2dca7a8387e2 tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Acquiring lock "49db2997-6ee3-4cbd-b640-77ad352ae2fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 834.705232] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bab765e0-3ba8-4cc5-9ca4-2dca7a8387e2 tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Lock "49db2997-6ee3-4cbd-b640-77ad352ae2fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.314869] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4cd5f959-82eb-44fd-a937-2a168b111220 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Acquiring lock "dacc9b15-d2d0-4d7e-b419-eff947683f42" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.315173] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4cd5f959-82eb-44fd-a937-2a168b111220 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "dacc9b15-d2d0-4d7e-b419-eff947683f42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.001033] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fb138079-a0e7-4e6a-bdf8-fade7e9e07ce tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Acquiring lock "b8f3a42e-9412-408f-bbbc-2d7a542bd82e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.001354] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fb138079-a0e7-4e6a-bdf8-fade7e9e07ce tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Lock "b8f3a42e-9412-408f-bbbc-2d7a542bd82e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 844.266728] env[68492]: DEBUG oslo_concurrency.lockutils [None req-277bafba-e318-4349-bec0-583423586f98 tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Acquiring lock "fc27ef4a-0a1d-49c7-b96d-5a57810117bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 844.267030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-277bafba-e318-4349-bec0-583423586f98 tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Lock "fc27ef4a-0a1d-49c7-b96d-5a57810117bc" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 847.203024] env[68492]: WARNING oslo_vmware.rw_handles [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 847.203024] env[68492]: ERROR oslo_vmware.rw_handles [ 847.203808] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 847.205139] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 847.205464] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Copying Virtual Disk [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/94491022-4ee6-458f-9227-d96b86c9ecaa/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 847.206759] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-67fee42a-7744-42f4-99c7-aa6729cc5a12 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.213490] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 
tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for the task: (returnval){ [ 847.213490] env[68492]: value = "task-3395376" [ 847.213490] env[68492]: _type = "Task" [ 847.213490] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.221254] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Task: {'id': task-3395376, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 847.723637] env[68492]: DEBUG oslo_vmware.exceptions [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 847.724089] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.724682] env[68492]: ERROR nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.724682] env[68492]: Faults: ['InvalidArgument'] [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Traceback (most recent call last): [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] yield resources [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self.driver.spawn(context, instance, image_meta, [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self._fetch_image_if_missing(context, vi) [ 847.724682] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 847.725094] 
env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] image_cache(vi, tmp_image_ds_loc) [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] vm_util.copy_virtual_disk( [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] session._wait_for_task(vmdk_copy_task) [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return self.wait_for_task(task_ref) [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return evt.wait() [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] result = hub.switch() [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.725094] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return self.greenlet.switch() [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self.f(*self.args, **self.kw) [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] raise exceptions.translate_fault(task_info.error) [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Faults: ['InvalidArgument'] [ 847.725464] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] [ 847.725464] env[68492]: INFO nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Terminating instance [ 847.726526] env[68492]: DEBUG oslo_concurrency.lockutils [None 
req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.726736] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 847.727300] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 847.727459] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.727623] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 847.728547] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e5002420-352a-48ae-acea-67b46dc3b89e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.738438] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 847.738609] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 847.739596] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-785e5159-57ea-4de5-957d-0a9a9dd45b98 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.744889] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){ [ 847.744889] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524dfec2-5551-1a28-5b66-bc3f28bdf44b" [ 847.744889] env[68492]: _type = "Task" [ 847.744889] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.752623] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524dfec2-5551-1a28-5b66-bc3f28bdf44b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 847.755827] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 847.820577] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.830157] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Releasing lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.830560] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 847.830780] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 847.831855] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5adc262-636a-49a0-8d59-1cec91068d8d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.839985] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 847.840432] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b68a01b2-fa71-4b39-80ba-1d668ec1338b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.868488] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 847.868740] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 847.868883] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Deleting the datastore file [datastore2] 5446b198-82c9-4a57-92e8-ffcf3c37be0d {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 847.869158] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-39c924b1-b677-44b1-8355-b1bd74f9b6ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.875488] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for the task: (returnval){ [ 847.875488] env[68492]: value = "task-3395378" [ 847.875488] env[68492]: _type = "Task" [ 847.875488] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.883356] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Task: {'id': task-3395378, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 848.254761] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 848.255076] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating directory with path [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 848.255311] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ab1eb1e1-28a1-4fe3-b2cb-8c06cc932d5b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.267747] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created directory with path [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 848.267949] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Fetch image to [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 848.268130] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 848.268873] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2676872-2ea1-4d01-b05b-a38a8e644095 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.275763] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18dcec55-394b-4c35-9025-007777577921 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.286781] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b8e10fd-2072-4704-86b5-ec8cb1321dcf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.317649] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4bfd82-562f-4c97-9c42-b5d80eed0feb {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.323640] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-703499dd-4a57-4d48-9d63-195770e8599d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.387148] env[68492]: DEBUG oslo_vmware.api [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Task: {'id': task-3395378, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.055201} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 848.387412] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 848.387611] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 848.387804] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 848.387979] env[68492]: INFO nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Took 0.56 seconds to destroy the instance on the hypervisor. [ 848.388241] env[68492]: DEBUG oslo.service.loopingcall [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 848.388454] env[68492]: DEBUG nova.compute.manager [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 848.390506] env[68492]: DEBUG nova.compute.claims [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 848.390698] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 848.390919] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 848.414086] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 848.471441] env[68492]: DEBUG oslo_vmware.rw_handles [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 848.532783] env[68492]: DEBUG oslo_vmware.rw_handles [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 848.532983] env[68492]: DEBUG oslo_vmware.rw_handles [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 848.778567] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71d667ea-2ec4-40f1-8b73-58d3a4414574 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.786552] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-349d2e5d-ec9f-4796-99d3-ce4ae2cffd3f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.818227] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69a69e97-abaa-4a76-a192-5c89f8d05811 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.825641] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f12137f-751e-4d9b-9bf8-5dae3d629038 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.840862] env[68492]: DEBUG nova.compute.provider_tree [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 848.852345] env[68492]: DEBUG nova.scheduler.client.report [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 848.869892] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.479s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 848.870493] env[68492]: ERROR nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.870493] env[68492]: Faults: ['InvalidArgument'] [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Traceback (most recent call last): [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 
5446b198-82c9-4a57-92e8-ffcf3c37be0d] self.driver.spawn(context, instance, image_meta, [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self._fetch_image_if_missing(context, vi) [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] image_cache(vi, tmp_image_ds_loc) [ 848.870493] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] vm_util.copy_virtual_disk( [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] session._wait_for_task(vmdk_copy_task) [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return self.wait_for_task(task_ref) [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return evt.wait() [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] result = hub.switch() [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] return self.greenlet.switch() [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 848.870888] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] self.f(*self.args, **self.kw) [ 848.871302] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 848.871302] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] raise exceptions.translate_fault(task_info.error) [ 848.871302] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.871302] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Faults: ['InvalidArgument'] [ 848.871302] env[68492]: ERROR nova.compute.manager [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] [ 848.871302] env[68492]: DEBUG nova.compute.utils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 848.872957] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Build of instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d was re-scheduled: A specified parameter was not correct: fileType [ 848.872957] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 848.874291] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 848.874291] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 848.874291] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 848.874291] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 849.073864] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 849.130529] env[68492]: DEBUG nova.network.neutron [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.139258] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Releasing lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 849.139477] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 849.139655] env[68492]: DEBUG nova.compute.manager [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Skipping network deallocation for instance since networking was not requested. {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 849.232108] env[68492]: INFO nova.scheduler.client.report [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Deleted allocations for instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d [ 849.249811] env[68492]: DEBUG oslo_concurrency.lockutils [None req-70280c9f-187d-4b5a-8c4b-381be615cd01 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 278.636s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.250941] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 78.074s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.251188] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.251392] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.251564] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.253637] env[68492]: INFO nova.compute.manager [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Terminating instance [ 849.255231] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquiring lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 849.255514] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Acquired lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 849.255514] env[68492]: DEBUG nova.network.neutron [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 849.266699] env[68492]: DEBUG nova.compute.manager [None req-fd2470d5-2181-48dc-bdf6-3debc140039a tempest-ServerDiagnosticsV248Test-663931398 tempest-ServerDiagnosticsV248Test-663931398-project-member] [instance: aae38f8c-fe29-478b-946a-1f75bb9434a4] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.289236] env[68492]: DEBUG nova.network.neutron [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 849.302258] env[68492]: DEBUG nova.compute.manager [None req-fd2470d5-2181-48dc-bdf6-3debc140039a tempest-ServerDiagnosticsV248Test-663931398 tempest-ServerDiagnosticsV248Test-663931398-project-member] [instance: aae38f8c-fe29-478b-946a-1f75bb9434a4] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.325014] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fd2470d5-2181-48dc-bdf6-3debc140039a tempest-ServerDiagnosticsV248Test-663931398 tempest-ServerDiagnosticsV248Test-663931398-project-member] Lock "aae38f8c-fe29-478b-946a-1f75bb9434a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.568s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.334585] env[68492]: DEBUG nova.compute.manager [None req-8c636232-a89a-47dc-9e02-e4820174d228 tempest-ServerAddressesTestJSON-565573396 tempest-ServerAddressesTestJSON-565573396-project-member] [instance: e410e6fa-7652-45d1-8ec1-f1c1db5c728f] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.364777] env[68492]: DEBUG nova.compute.manager [None req-8c636232-a89a-47dc-9e02-e4820174d228 tempest-ServerAddressesTestJSON-565573396 tempest-ServerAddressesTestJSON-565573396-project-member] [instance: e410e6fa-7652-45d1-8ec1-f1c1db5c728f] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.369672] env[68492]: DEBUG nova.network.neutron [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.378897] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Releasing lock "refresh_cache-5446b198-82c9-4a57-92e8-ffcf3c37be0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 849.383032] env[68492]: DEBUG nova.compute.manager [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 849.383141] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 849.383604] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-79be0499-fac9-4ff3-9ab9-79e89bc668b5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.388401] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c636232-a89a-47dc-9e02-e4820174d228 tempest-ServerAddressesTestJSON-565573396 tempest-ServerAddressesTestJSON-565573396-project-member] Lock "e410e6fa-7652-45d1-8ec1-f1c1db5c728f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.321s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.394191] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729f7db6-2380-40cd-81ee-fbfc9e335c34 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.408164] env[68492]: DEBUG nova.compute.manager [None req-cf9c310f-172d-4e95-b4b4-607f3caf131b tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: e7c66cb6-10fc-44d4-9821-6e3141e04024] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.426755] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5446b198-82c9-4a57-92e8-ffcf3c37be0d could not be found. [ 849.426950] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 849.427135] env[68492]: INFO nova.compute.manager [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 849.427375] env[68492]: DEBUG oslo.service.loopingcall [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 849.427600] env[68492]: DEBUG nova.compute.manager [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 849.427693] env[68492]: DEBUG nova.network.neutron [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 849.436291] env[68492]: DEBUG nova.compute.manager [None req-cf9c310f-172d-4e95-b4b4-607f3caf131b tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: e7c66cb6-10fc-44d4-9821-6e3141e04024] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.445259] env[68492]: DEBUG nova.network.neutron [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 849.453025] env[68492]: DEBUG nova.network.neutron [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.462019] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cf9c310f-172d-4e95-b4b4-607f3caf131b tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "e7c66cb6-10fc-44d4-9821-6e3141e04024" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.760s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.463473] env[68492]: INFO nova.compute.manager [-] [instance: 5446b198-82c9-4a57-92e8-ffcf3c37be0d] Took 0.04 seconds to deallocate network for instance. [ 849.472656] env[68492]: DEBUG nova.compute.manager [None req-5fe4e49e-cce3-469c-b74c-6e44c83ce18c tempest-ImagesNegativeTestJSON-1217222349 tempest-ImagesNegativeTestJSON-1217222349-project-member] [instance: 31f0fab8-123f-4857-93a7-517ac44dbf9d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.498944] env[68492]: DEBUG nova.compute.manager [None req-5fe4e49e-cce3-469c-b74c-6e44c83ce18c tempest-ImagesNegativeTestJSON-1217222349 tempest-ImagesNegativeTestJSON-1217222349-project-member] [instance: 31f0fab8-123f-4857-93a7-517ac44dbf9d] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.518949] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fe4e49e-cce3-469c-b74c-6e44c83ce18c tempest-ImagesNegativeTestJSON-1217222349 tempest-ImagesNegativeTestJSON-1217222349-project-member] Lock "31f0fab8-123f-4857-93a7-517ac44dbf9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.734s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.529744] env[68492]: DEBUG nova.compute.manager [None req-49587811-2a76-4767-8140-91ad086366cc tempest-FloatingIPsAssociationNegativeTestJSON-1547245369 tempest-FloatingIPsAssociationNegativeTestJSON-1547245369-project-member] [instance: d720fc20-a7a6-4826-9174-2fb12bb0a6c1] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.567381] env[68492]: DEBUG nova.compute.manager [None req-49587811-2a76-4767-8140-91ad086366cc tempest-FloatingIPsAssociationNegativeTestJSON-1547245369 tempest-FloatingIPsAssociationNegativeTestJSON-1547245369-project-member] [instance: d720fc20-a7a6-4826-9174-2fb12bb0a6c1] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.582735] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4d18fcf-5c51-4f77-84ab-0a4f18a4ac35 tempest-ServersAdmin275Test-611248155 tempest-ServersAdmin275Test-611248155-project-member] Lock "5446b198-82c9-4a57-92e8-ffcf3c37be0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.332s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.594879] env[68492]: DEBUG oslo_concurrency.lockutils [None req-49587811-2a76-4767-8140-91ad086366cc tempest-FloatingIPsAssociationNegativeTestJSON-1547245369 tempest-FloatingIPsAssociationNegativeTestJSON-1547245369-project-member] Lock "d720fc20-a7a6-4826-9174-2fb12bb0a6c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.110s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.606595] env[68492]: DEBUG nova.compute.manager [None req-083cddbc-c6cc-4246-bd73-59984fcd3343 tempest-FloatingIPsAssociationTestJSON-485227705 tempest-FloatingIPsAssociationTestJSON-485227705-project-member] [instance: 2590f6bd-a48f-49ad-b955-a0ebec9d31e3] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.632039] env[68492]: DEBUG nova.compute.manager [None req-083cddbc-c6cc-4246-bd73-59984fcd3343 tempest-FloatingIPsAssociationTestJSON-485227705 tempest-FloatingIPsAssociationTestJSON-485227705-project-member] [instance: 2590f6bd-a48f-49ad-b955-a0ebec9d31e3] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.652336] env[68492]: DEBUG oslo_concurrency.lockutils [None req-083cddbc-c6cc-4246-bd73-59984fcd3343 tempest-FloatingIPsAssociationTestJSON-485227705 tempest-FloatingIPsAssociationTestJSON-485227705-project-member] Lock "2590f6bd-a48f-49ad-b955-a0ebec9d31e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.713s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.663484] env[68492]: DEBUG nova.compute.manager [None req-53f0b282-5d2d-4456-82ee-fca2cd7c3ca8 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] [instance: 9d15dfea-323f-4007-91cb-0a0b64d60a5e] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.686353] env[68492]: DEBUG nova.compute.manager [None req-53f0b282-5d2d-4456-82ee-fca2cd7c3ca8 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] [instance: 9d15dfea-323f-4007-91cb-0a0b64d60a5e] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.707706] env[68492]: DEBUG oslo_concurrency.lockutils [None req-53f0b282-5d2d-4456-82ee-fca2cd7c3ca8 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Lock "9d15dfea-323f-4007-91cb-0a0b64d60a5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.527s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.717220] env[68492]: DEBUG nova.compute.manager [None req-513f315a-28ad-46ef-b482-909fc804883e tempest-AttachInterfacesV270Test-472283853 tempest-AttachInterfacesV270Test-472283853-project-member] [instance: 81d59156-2869-4045-a2d3-349e6077f477] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.740500] env[68492]: DEBUG nova.compute.manager [None req-513f315a-28ad-46ef-b482-909fc804883e tempest-AttachInterfacesV270Test-472283853 tempest-AttachInterfacesV270Test-472283853-project-member] [instance: 81d59156-2869-4045-a2d3-349e6077f477] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.763675] env[68492]: DEBUG oslo_concurrency.lockutils [None req-513f315a-28ad-46ef-b482-909fc804883e tempest-AttachInterfacesV270Test-472283853 tempest-AttachInterfacesV270Test-472283853-project-member] Lock "81d59156-2869-4045-a2d3-349e6077f477" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.457s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.772296] env[68492]: DEBUG nova.compute.manager [None req-3c0ebeb4-da4b-4dca-9428-37df29488a3e tempest-ServerGroupTestJSON-859793356 tempest-ServerGroupTestJSON-859793356-project-member] [instance: 1ee59a29-0ef7-4906-a027-90992418c3fb] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.795130] env[68492]: DEBUG nova.compute.manager [None req-3c0ebeb4-da4b-4dca-9428-37df29488a3e tempest-ServerGroupTestJSON-859793356 tempest-ServerGroupTestJSON-859793356-project-member] [instance: 1ee59a29-0ef7-4906-a027-90992418c3fb] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 849.817479] env[68492]: DEBUG oslo_concurrency.lockutils [None req-3c0ebeb4-da4b-4dca-9428-37df29488a3e tempest-ServerGroupTestJSON-859793356 tempest-ServerGroupTestJSON-859793356-project-member] Lock "1ee59a29-0ef7-4906-a027-90992418c3fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.794s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.828137] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 849.893038] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.893309] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.894832] env[68492]: INFO nova.compute.claims [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 850.299021] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11b133c2-58f8-4889-8897-038ab359b9b8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.307277] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58580fa2-370e-4c17-b159-3ff31299ca94 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.343419] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00409248-fc93-490b-a7a2-cbda8f201d4e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.353524] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7683b9c-13f6-4b61-a8a5-a149d5b1eb0c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.368435] 
env[68492]: DEBUG nova.compute.provider_tree [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.380759] env[68492]: DEBUG nova.scheduler.client.report [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.394019] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.501s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.394502] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 850.435538] env[68492]: DEBUG nova.compute.utils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 850.436845] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 850.437034] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 850.447621] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Start building block device mappings for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 850.513976] env[68492]: DEBUG nova.policy [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '721b0fa31ea449c88dc7dcf86ab7b74c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '514e008c899841c2ae6cd90a3519df72', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 850.524437] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 850.557170] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 850.557532] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 850.557751] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 850.557983] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 850.558192] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 850.558385] env[68492]: DEBUG nova.virt.hardware 
[None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 850.558633] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 850.558851] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 850.559052] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 850.559260] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 850.559495] env[68492]: DEBUG nova.virt.hardware [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 850.560796] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25649977-5c31-470b-a784-25bfdc061240 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.569767] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4dc483e-4b7d-4266-b467-0bdbfce8f391 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.946338] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Successfully created port: 883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 851.588328] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Successfully updated port: 883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 851.608620] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a 
tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 851.608976] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 851.608976] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 851.628513] env[68492]: DEBUG nova.compute.manager [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Received event network-vif-plugged-883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 851.628880] env[68492]: DEBUG oslo_concurrency.lockutils [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] Acquiring lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 851.628929] env[68492]: DEBUG oslo_concurrency.lockutils [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 851.629213] env[68492]: DEBUG oslo_concurrency.lockutils [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 851.629307] env[68492]: DEBUG nova.compute.manager [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] No waiting events found dispatching network-vif-plugged-883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 851.629426] env[68492]: WARNING nova.compute.manager [req-0b0f95e6-9bf8-4816-bd1f-d8d8e5ff8ac3 req-958924c6-a66f-40ad-b9f0-30d6c19652c5 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Received unexpected event network-vif-plugged-883d3edc-5159-468a-b0d0-6f6b476873ba for instance with vm_state building and task_state spawning.
[ 851.659496] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 851.843940] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Updating instance_info_cache with network_info: [{"id": "883d3edc-5159-468a-b0d0-6f6b476873ba", "address": "fa:16:3e:d2:b8:ee", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap883d3edc-51", "ovs_interfaceid": "883d3edc-5159-468a-b0d0-6f6b476873ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 851.859327] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 851.859796] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance network_info: |[{"id": "883d3edc-5159-468a-b0d0-6f6b476873ba", "address": "fa:16:3e:d2:b8:ee", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap883d3edc-51", "ovs_interfaceid": 
"883d3edc-5159-468a-b0d0-6f6b476873ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 851.862147] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d2:b8:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '11b669be-fb26-4ef8-bdb6-c77ab9d06daf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '883d3edc-5159-468a-b0d0-6f6b476873ba', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 851.869524] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating folder: Project (514e008c899841c2ae6cd90a3519df72). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.870228] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7a0824ce-04c8-4bcd-8427-2e87cd78df48 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.880954] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created folder: Project (514e008c899841c2ae6cd90a3519df72) in parent group-v677434. [ 851.881875] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating folder: Instances. Parent ref: group-v677478. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.883189] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a74b82e-0d2b-45f7-8a61-c358792c36d0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.892759] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created folder: Instances in parent group-v677478. [ 851.893223] env[68492]: DEBUG oslo.service.loopingcall [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 851.893566] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 851.893969] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a8f5e14b-5ba7-4c2f-bb34-7c4472b27ca8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.917176] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 851.917176] env[68492]: value = "task-3395381" [ 851.917176] env[68492]: _type = "Task" [ 851.917176] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 851.923579] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395381, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 852.426685] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395381, 'name': CreateVM_Task, 'duration_secs': 0.284319} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 852.426921] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 852.430852] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 852.431079] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 852.431364] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 852.431628] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2db02f06-ad9d-40a0-8cc9-d6e06cbafff8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.436424] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 852.436424] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52897f7e-3472-41b5-b628-24232210e637" [ 852.436424] env[68492]: _type = "Task" [ 852.436424] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 852.443999] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52897f7e-3472-41b5-b628-24232210e637, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 852.950221] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 852.950221] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 852.950221] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 853.674826] env[68492]: DEBUG nova.compute.manager [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Received event network-changed-883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 853.675128] env[68492]: DEBUG nova.compute.manager [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Refreshing instance network info cache due to event network-changed-883d3edc-5159-468a-b0d0-6f6b476873ba. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 853.675381] env[68492]: DEBUG oslo_concurrency.lockutils [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] Acquiring lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 853.675381] env[68492]: DEBUG oslo_concurrency.lockutils [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] Acquired lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 853.675488] env[68492]: DEBUG nova.network.neutron [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Refreshing network info cache for port 883d3edc-5159-468a-b0d0-6f6b476873ba {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 853.986255] env[68492]: DEBUG nova.network.neutron [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Updated VIF entry in instance network info cache for port 883d3edc-5159-468a-b0d0-6f6b476873ba. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 853.986609] env[68492]: DEBUG nova.network.neutron [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Updating instance_info_cache with network_info: [{"id": "883d3edc-5159-468a-b0d0-6f6b476873ba", "address": "fa:16:3e:d2:b8:ee", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap883d3edc-51", "ovs_interfaceid": "883d3edc-5159-468a-b0d0-6f6b476873ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 854.007706] env[68492]: DEBUG oslo_concurrency.lockutils [req-2276c077-b307-408e-a1b4-5dddac7f5c7c req-2cc97dce-3b8b-47f0-822a-219d1958b128 service nova] Releasing lock "refresh_cache-3b1ce4e1-bbad-4030-84d9-f814a44eec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 854.691667] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock
"3b1ce4e1-bbad-4030-84d9-f814a44eec4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 872.251015] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 873.230784] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 874.704960] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.705309] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 875.225800] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.230403] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.240632] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 875.240632] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 875.240632] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 875.240821] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally
available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 875.241887] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd957d5-3ce8-4562-bf98-dc8c022f6b7c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.250921] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8cdc90a-8a50-43c7-a441-5331e03856cc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.264882] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b73b3cef-4e56-4a63-a93a-7f4bd19500ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.271092] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5c2eb24-ea81-4841-b8fa-a0f72cc4005a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.301130] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 875.301273] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 875.301469] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 875.374625] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f3c94673-a8fc-4ead-9907-4347cd6244ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.374787] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.374915] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375054] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375181] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375299] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375412] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375529] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375644] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.375758] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 875.387320] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.397893] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance eae1ea40-8ebd-4b7a-9489-e0e70653a517 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.408789] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.419810] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f4669ef-c7da-4f9a-9ebe-83947f00863a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.429112] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 09401266-1c03-4c2e-b850-e7196bcb1e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.438095] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.448090] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f48567a8-6b74-46ee-af6b-37823323e17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.457645] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a59a286e-ad8c-4628-b326-09762dea3534 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.466701] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2598cded-78b6-4230-98c5-7068b429a56c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.475862] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab820eba-d4d5-4b07-bc68-79c4b8cf46c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.484946] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0de36474-6ab2-4c5c-a85c-5080d82b3f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.494060] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49db2997-6ee3-4cbd-b640-77ad352ae2fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.503431] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance dacc9b15-d2d0-4d7e-b419-eff947683f42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.512586] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b8f3a42e-9412-408f-bbbc-2d7a542bd82e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.522198] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fc27ef4a-0a1d-49c7-b96d-5a57810117bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.531850] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 875.532181] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 875.532340] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 875.821236] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e95dbea-ccbb-40aa-98a8-206fdb4200a0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.830024] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3da8ea83-922e-4bef-b23c-92f4a3c48a28 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.859633] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8256976-1a5e-453e-89d2-8d2b6d546eb8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.867138] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802684c2-83af-4e54-be4e-6974d56eab8c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.880202] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 875.888686] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 875.905123] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 875.905123] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 876.905067] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.905067] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 876.905067] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 876.924269] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.924419] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.924545] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.924917] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.924917] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.924917] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.925100] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.925137] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.925250] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.925365] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 876.925485] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 876.925949] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.926153] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.926286] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 877.230935] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.230676] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.231047] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 897.432094] env[68492]: WARNING oslo_vmware.rw_handles [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 897.432094] env[68492]: ERROR oslo_vmware.rw_handles [ 897.432751] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 897.434408] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 897.434656] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Copying Virtual 
Disk [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/cee9072a-86c7-4ddf-9de3-1ef65d946317/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 897.434934] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-81c1bc45-21d0-4cc0-a455-476c33d1c81e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.442240] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){ [ 897.442240] env[68492]: value = "task-3395393" [ 897.442240] env[68492]: _type = "Task" [ 897.442240] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 897.451367] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395393, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 897.952332] env[68492]: DEBUG oslo_vmware.exceptions [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 897.952633] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 897.953234] env[68492]: ERROR nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.953234] env[68492]: Faults: ['InvalidArgument'] [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Traceback (most recent call last): [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] yield resources [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self.driver.spawn(context, instance, image_meta, [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self._fetch_image_if_missing(context, vi) [ 897.953234] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] image_cache(vi, tmp_image_ds_loc) [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] vm_util.copy_virtual_disk( [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] session._wait_for_task(vmdk_copy_task) [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return self.wait_for_task(task_ref) [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return evt.wait() [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] result = hub.switch() [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 897.953891] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return self.greenlet.switch() [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self.f(*self.args, **self.kw) [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] raise exceptions.translate_fault(task_info.error) [ 897.954391] 
env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Faults: ['InvalidArgument'] [ 897.954391] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] [ 897.954391] env[68492]: INFO nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Terminating instance [ 897.955099] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 897.955351] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.955547] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-44b2acaa-8a33-4f63-a8ff-f2ed66ec5503 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.957770] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 897.957968] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 897.958718] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6ca3587-50ad-4529-8901-008d07ae4a9d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.965539] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 897.965777] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-29351a82-408f-473a-8b29-933bcd9cbee7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.967938] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.968126] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 897.969050] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2529865e-e56e-45e5-b2b4-4a8d8616436c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.973599] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for the task: (returnval){ [ 897.973599] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52af7034-8579-4474-038b-a8e00e7bd64d" [ 897.973599] env[68492]: _type = "Task" [ 897.973599] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 897.985231] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52af7034-8579-4474-038b-a8e00e7bd64d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 898.042212] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 898.042422] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 898.042625] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleting the datastore file [datastore2] f3c94673-a8fc-4ead-9907-4347cd6244ba {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 898.042914] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bac1ad53-cbfa-4c3e-adfa-b0922e192c4f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.048396] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){ [ 898.048396] env[68492]: value = "task-3395395" [ 898.048396] env[68492]: _type = "Task" [ 898.048396] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 898.055931] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395395, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 898.485162] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 898.486960] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Creating directory with path [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 898.487365] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9de92613-b058-4745-b8b7-02e8e8b253d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.503092] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Created directory with path [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 898.503092] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Fetch image to [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 898.503092] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 898.503092] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-053325ec-2f1c-4784-b332-b602d555ccfb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.509854] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0a5cb7a-b742-43d8-8b14-0e63d99889e1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.519125] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c85dc13-0e3b-49b3-bafe-984499add0cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.553948] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0f546e6d-0431-4725-ada4-fd3f3e519bb4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.560643] env[68492]: DEBUG oslo_vmware.api [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395395, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074882} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 898.562205] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 898.562449] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 898.562664] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 898.562877] env[68492]: INFO nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Took 0.60 seconds to destroy the instance on the hypervisor. 
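The copy-and-poll sequence above (CopyVirtualDisk_Task invoked, then the task polled until vCenter reports the InvalidArgument fault on fileType) is oslo.vmware's standard invoke/wait pattern. A minimal sketch of that pattern follows; the host, credentials, and datastore paths are placeholders rather than values from this log, and the optional datacenter/spec arguments to CopyVirtualDisk_Task are omitted for brevity:

# Sketch of the oslo.vmware invoke/wait pattern seen in the trace above.
# Host, credentials, and datastore paths here are placeholders, not
# values taken from this log.
from oslo_vmware import api
from oslo_vmware import exceptions as vexc

session = api.VMwareAPISession(
    'vcenter.example.org', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5, port=443)

disk_mgr = session.vim.service_content.virtualDiskManager
try:
    # Returns a task moref immediately; the copy runs server-side.
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/example/tmp-sparse.vmdk',
        destName='[datastore2] vmware_temp/example/example.vmdk')
    # Polls TaskInfo on an interval and raises the translated fault on
    # error -- the point where the InvalidArgument/fileType
    # VimFaultException above surfaced.
    session.wait_for_task(task)
except vexc.VimFaultException as e:
    print('copy failed: %s (faults: %s)' % (e, e.fault_list))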
[ 898.564747] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5fa7b63e-1c61-4795-861b-01f1c4211687 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.566685] env[68492]: DEBUG nova.compute.claims [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 898.566905] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 898.567176] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 898.590942] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 898.708783] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 898.774281] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 898.774493] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 898.985359] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07cc264e-055d-4ab8-9df8-a0e5c007b7ac {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.992664] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bafca835-862c-4599-ab71-ed97a56ba212 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.021717] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6647521c-aa38-4c76-b321-6f717c3074e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.028682] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db82afa-03b7-43d5-8762-442f41a2724e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.041893] env[68492]: DEBUG nova.compute.provider_tree [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 899.050809] env[68492]: DEBUG nova.scheduler.client.report [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 899.065121] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.498s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 899.065659] env[68492]: ERROR nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 899.065659] env[68492]: Faults: ['InvalidArgument'] [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Traceback (most recent call last): [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: 
f3c94673-a8fc-4ead-9907-4347cd6244ba] self.driver.spawn(context, instance, image_meta, [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self._fetch_image_if_missing(context, vi) [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] image_cache(vi, tmp_image_ds_loc) [ 899.065659] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] vm_util.copy_virtual_disk( [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] session._wait_for_task(vmdk_copy_task) [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return self.wait_for_task(task_ref) [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return evt.wait() [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] result = hub.switch() [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] return self.greenlet.switch() [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 899.066094] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] self.f(*self.args, **self.kw) [ 899.066526] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 899.066526] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] raise exceptions.translate_fault(task_info.error) [ 899.066526] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 899.066526] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Faults: ['InvalidArgument'] [ 899.066526] env[68492]: ERROR nova.compute.manager [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] [ 899.066526] env[68492]: DEBUG nova.compute.utils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 899.067799] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Build of instance f3c94673-a8fc-4ead-9907-4347cd6244ba was re-scheduled: A specified parameter was not correct: fileType [ 899.067799] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 899.068220] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 899.068412] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 899.068567] env[68492]: DEBUG nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 899.068732] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 899.389085] env[68492]: DEBUG nova.network.neutron [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.406262] env[68492]: INFO nova.compute.manager [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Took 0.34 seconds to deallocate network for instance. [ 899.503275] env[68492]: INFO nova.scheduler.client.report [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleted allocations for instance f3c94673-a8fc-4ead-9907-4347cd6244ba [ 899.524722] env[68492]: DEBUG oslo_concurrency.lockutils [None req-65bbe703-99a0-40d6-b39d-c69fe5a6f7df tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 324.827s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 899.525861] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 125.592s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 899.526095] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 899.526302] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 899.526856] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 899.528380] env[68492]: INFO nova.compute.manager [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Terminating instance [ 899.530175] env[68492]: DEBUG nova.compute.manager [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 899.530368] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 899.530818] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-046447df-d2cc-4126-b5fa-7090890d0042 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.540997] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a32fff4d-077a-4b2f-9681-0d02a9288b00 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.554093] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 899.576487] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f3c94673-a8fc-4ead-9907-4347cd6244ba could not be found. [ 899.576721] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 899.576906] env[68492]: INFO nova.compute.manager [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Took 0.05 seconds to destroy the instance on the hypervisor. 
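The "Acquiring lock ... by ...", "Lock ... acquired ... waited N.NNNs", and "released ... held N.NNNs" triples that bracket the terminate path above are emitted by oslo.concurrency's lockutils wrapper, which times how long a caller waited for and then held a named semaphore. A minimal sketch of the same pattern; the lock names and function body are illustrative, not Nova code:

# Sketch of the pattern behind the "Acquiring lock ... / acquired ...
# waited / released ... held" lines above. Lock names and body are
# illustrative only.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def abort_claim(instance_uuid):
    # Runs under the in-process 'compute_resources' semaphore, so
    # concurrent claims and aborts are serialized, as in the
    # ResourceTracker lock lines above.
    print('aborting claim for %s' % instance_uuid)

# Equivalent inline form for ad-hoc critical sections:
with lockutils.lock('instance-events'):
    pass  # clear per-instance events here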
[ 899.577784] env[68492]: DEBUG oslo.service.loopingcall [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 899.577784] env[68492]: DEBUG nova.compute.manager [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 899.577784] env[68492]: DEBUG nova.network.neutron [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 899.605251] env[68492]: DEBUG nova.network.neutron [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.612644] env[68492]: INFO nova.compute.manager [-] [instance: f3c94673-a8fc-4ead-9907-4347cd6244ba] Took 0.04 seconds to deallocate network for instance. [ 899.617638] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 899.617870] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 899.619294] env[68492]: INFO nova.compute.claims [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 899.711531] env[68492]: DEBUG oslo_concurrency.lockutils [None req-134fbe31-a152-4e77-a725-e4ba833ce401 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "f3c94673-a8fc-4ead-9907-4347cd6244ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.185s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 900.026319] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc8a4adc-a124-4245-960e-8d755cab786f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.034081] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac73630-dbdd-4733-92c8-c34561b77f86 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.064386] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b709e6-a35c-4283-a63a-fed0ed03cc65 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.071460] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a57bed-b9e6-44d0-8fad-e473ba540249 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.084534] env[68492]: DEBUG nova.compute.provider_tree [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 900.112477] env[68492]: DEBUG nova.scheduler.client.report [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 900.126870] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.509s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 900.127461] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 900.158603] env[68492]: DEBUG nova.compute.utils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 900.160052] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Allocating IP information in the background. 
{{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 900.160052] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 900.173123] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 900.231932] env[68492]: DEBUG nova.policy [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4e1492c4ec848fd9c4cd177987363b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29aa609a5781465cbe2bab5e72a4a590', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 900.239657] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 900.262065] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 900.262300] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 900.262452] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 900.262625] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 900.262763] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 900.262903] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 900.263148] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 900.263319] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 900.263483] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 900.263640] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 900.263806] env[68492]: DEBUG nova.virt.hardware [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 900.264947] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa8565e-76e5-4d3b-98ec-993ce8bbaaa7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.273467] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd25428-7641-4885-8375-b85e920f1917 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.549379] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Successfully created port: 3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 900.712518] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 900.712746] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.158627] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Successfully updated port: 3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 901.173273] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 
tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 901.173744] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquired lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 901.175108] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 901.211471] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 901.387255] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Updating instance_info_cache with network_info: [{"id": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "address": "fa:16:3e:2b:55:a2", "network": {"id": "3ce68f6f-197c-4193-9cdb-49e483bfd140", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-569035776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29aa609a5781465cbe2bab5e72a4a590", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f6d1427-d86b-4371-9172-50e4bb0eb1cb", "external-id": "nsx-vlan-transportzone-979", "segmentation_id": 979, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a166122-0a", "ovs_interfaceid": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 901.406157] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Releasing lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 901.406459] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 
tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance network_info: |[{"id": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "address": "fa:16:3e:2b:55:a2", "network": {"id": "3ce68f6f-197c-4193-9cdb-49e483bfd140", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-569035776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29aa609a5781465cbe2bab5e72a4a590", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f6d1427-d86b-4371-9172-50e4bb0eb1cb", "external-id": "nsx-vlan-transportzone-979", "segmentation_id": 979, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a166122-0a", "ovs_interfaceid": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 901.406893] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2b:55:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f6d1427-d86b-4371-9172-50e4bb0eb1cb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a166122-0ae0-433f-91fd-5fa9202ffcc9', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 901.414577] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Creating folder: Project (29aa609a5781465cbe2bab5e72a4a590). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 901.415114] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-995b61c1-1456-4cf2-b2c4-5cceae787881 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.426623] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Created folder: Project (29aa609a5781465cbe2bab5e72a4a590) in parent group-v677434. [ 901.426916] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Creating folder: Instances. Parent ref: group-v677485. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 901.427160] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e2786265-8140-407d-b160-ece934fc4fd1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.436547] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Created folder: Instances in parent group-v677485. [ 901.436787] env[68492]: DEBUG oslo.service.loopingcall [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 901.436944] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 901.437169] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5e7ecbfb-cbc4-4556-96cf-7f4bc3c630dc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.458514] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 901.458514] env[68492]: value = "task-3395398" [ 901.458514] env[68492]: _type = "Task" [ 901.458514] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 901.466260] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395398, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 901.500134] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "569b49ff-047a-4494-b869-6598764da9d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.696992] env[68492]: DEBUG nova.compute.manager [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Received event network-vif-plugged-3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 901.697361] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Acquiring lock "569b49ff-047a-4494-b869-6598764da9d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.697561] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Lock "569b49ff-047a-4494-b869-6598764da9d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.697803] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Lock "569b49ff-047a-4494-b869-6598764da9d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 901.697996] env[68492]: DEBUG nova.compute.manager [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] No waiting events found dispatching network-vif-plugged-3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 901.698181] env[68492]: WARNING nova.compute.manager [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Received unexpected event network-vif-plugged-3a166122-0ae0-433f-91fd-5fa9202ffcc9 for instance with vm_state building and task_state deleting. [ 901.698346] env[68492]: DEBUG nova.compute.manager [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Received event network-changed-3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 901.698504] env[68492]: DEBUG nova.compute.manager [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Refreshing instance network info cache due to event network-changed-3a166122-0ae0-433f-91fd-5fa9202ffcc9.
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 901.698688] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Acquiring lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 901.698823] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Acquired lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 901.698979] env[68492]: DEBUG nova.network.neutron [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Refreshing network info cache for port 3a166122-0ae0-433f-91fd-5fa9202ffcc9 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 901.955180] env[68492]: DEBUG nova.network.neutron [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Updated VIF entry in instance network info cache for port 3a166122-0ae0-433f-91fd-5fa9202ffcc9. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 901.955629] env[68492]: DEBUG nova.network.neutron [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Updating instance_info_cache with network_info: [{"id": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "address": "fa:16:3e:2b:55:a2", "network": {"id": "3ce68f6f-197c-4193-9cdb-49e483bfd140", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-569035776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29aa609a5781465cbe2bab5e72a4a590", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f6d1427-d86b-4371-9172-50e4bb0eb1cb", "external-id": "nsx-vlan-transportzone-979", "segmentation_id": 979, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a166122-0a", "ovs_interfaceid": "3a166122-0ae0-433f-91fd-5fa9202ffcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 901.967894] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395398, 'name': CreateVM_Task, 'duration_secs': 0.355959} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 901.968598] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 901.969043] env[68492]: DEBUG oslo_concurrency.lockutils [req-de567864-7126-4858-867b-e10df894a83d req-a4eefa6d-0217-40b1-90ad-9e4fb6d5b734 service nova] Releasing lock "refresh_cache-569b49ff-047a-4494-b869-6598764da9d7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 901.969804] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 901.969960] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 901.970285] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 901.970742] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ffb50b90-9662-4575-9d43-3ae12c9f03d9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.975186] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for the task: (returnval){ [ 901.975186] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5265c0db-6586-d674-612f-8d28a7b80ea4" [ 901.975186] env[68492]: _type = "Task" [ 901.975186] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 901.982823] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5265c0db-6586-d674-612f-8d28a7b80ea4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 902.486155] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 902.486414] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 902.486606] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.232766] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.233083] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.233083] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 936.233083] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 936.254742] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.255372] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.255687] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.255950] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.256218] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.256540] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.256816] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.257283] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.257283] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.257602] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 936.258295] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 937.230785] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.231049] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.231239] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.243429] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.243429] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.243794] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 937.243794] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 937.244932] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f680958d-7490-4dd3-8558-5a3182504dae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.253980] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db7486c0-7593-4478-81e3-f6fde5c1c27e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.268441] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94f3f239-4516-4fa1-90a0-693901bbf9a0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.275084] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-299afa32-5a70-4c3a-a125-a2bc51b8b5f5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.304129] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180927MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 937.304299] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.304509] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.385703] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b7e0d1c7-d21b-42c1-b400-86be946df689 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.385915] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386075] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386195] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386317] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386437] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386548] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.386694] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.387733] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.387733] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 937.398534] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance eae1ea40-8ebd-4b7a-9489-e0e70653a517 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.408113] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.425394] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f4669ef-c7da-4f9a-9ebe-83947f00863a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.435119] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 09401266-1c03-4c2e-b850-e7196bcb1e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.445776] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.456025] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f48567a8-6b74-46ee-af6b-37823323e17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.465032] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a59a286e-ad8c-4628-b326-09762dea3534 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.475209] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2598cded-78b6-4230-98c5-7068b429a56c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.487690] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab820eba-d4d5-4b07-bc68-79c4b8cf46c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.497983] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0de36474-6ab2-4c5c-a85c-5080d82b3f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.506493] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49db2997-6ee3-4cbd-b640-77ad352ae2fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.516696] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance dacc9b15-d2d0-4d7e-b419-eff947683f42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.528029] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b8f3a42e-9412-408f-bbbc-2d7a542bd82e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.537747] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fc27ef4a-0a1d-49c7-b96d-5a57810117bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.548030] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.558415] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 937.558645] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 937.558789] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 937.840734] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0604d7-cbc6-4bf6-b84c-6ff90c298893 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.848432] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d693cf7e-a28a-4481-995a-cbbe5472c2c6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.879017] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-511521dd-15a5-4381-99f4-3f8f988fa442 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.884989] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b85e4b-d535-4d8d-a9f6-c2bafaad13d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.897382] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 937.905808] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 937.922591] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 937.922768] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 938.923144] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.923520] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 939.231940] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 939.232204] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.231473] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 947.235885] env[68492]: WARNING oslo_vmware.rw_handles [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 947.235885] env[68492]: ERROR oslo_vmware.rw_handles [ 947.236445] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 947.238097] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 
tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 947.238341] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Copying Virtual Disk [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/45beeb5a-fae6-4d71-94fb-9d0db0d044c9/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 947.238633] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d85dbe70-bf3e-4b00-ab2f-fb318bd7db0a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.247899] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for the task: (returnval){ [ 947.247899] env[68492]: value = "task-3395399" [ 947.247899] env[68492]: _type = "Task" [ 947.247899] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.256280] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Task: {'id': task-3395399, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.758126] env[68492]: DEBUG oslo_vmware.exceptions [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 947.758412] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 947.758954] env[68492]: ERROR nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.758954] env[68492]: Faults: ['InvalidArgument'] [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Traceback (most recent call last): [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] yield resources [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self.driver.spawn(context, instance, image_meta, [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self._fetch_image_if_missing(context, vi) [ 947.758954] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] image_cache(vi, tmp_image_ds_loc) [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] vm_util.copy_virtual_disk( [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] session._wait_for_task(vmdk_copy_task) [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return self.wait_for_task(task_ref) [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return evt.wait() [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] result = hub.switch() [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 947.759257] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return self.greenlet.switch() [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self.f(*self.args, **self.kw) [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] raise exceptions.translate_fault(task_info.error) [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Faults: ['InvalidArgument'] [ 947.759573] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] [ 947.759573] env[68492]: INFO nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Terminating instance [ 947.760814] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 947.761814] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.761814] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 
tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 947.761924] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 947.762149] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f1a9463-4e64-43ec-bc68-051fd7aeb710 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.764680] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a30bf43-45bc-4f87-83c3-28064e8500bd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.772749] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 947.773583] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-34bbec87-687d-4974-b091-14b04b84c0ad {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.775031] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.775212] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 947.776043] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1e165bc6-4281-4d5c-97bd-21ba60a06fb5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.781360] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for the task: (returnval){ [ 947.781360] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e3dcf8-3d4c-271d-e4e3-967b7a93b37f" [ 947.781360] env[68492]: _type = "Task" [ 947.781360] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.793517] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e3dcf8-3d4c-271d-e4e3-967b7a93b37f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.847849] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 947.848096] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 947.848282] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Deleting the datastore file [datastore2] b7e0d1c7-d21b-42c1-b400-86be946df689 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 947.848553] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ef40ea08-0f5b-4d5c-95ae-0955c4787d9a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.855015] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for the task: (returnval){ [ 947.855015] env[68492]: value = "task-3395401" [ 947.855015] env[68492]: _type = "Task" [ 947.855015] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.863151] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Task: {'id': task-3395401, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 948.299199] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 948.299531] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Creating directory with path [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.299703] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac2dc07a-da5a-4481-ab8d-d9615ea8c486 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.312855] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Created directory with path [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.313343] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Fetch image to [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 948.313343] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 948.315022] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5249b8a-f277-400c-82ee-e6bbb2370cb3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.323481] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-165e135f-9687-4394-9b89-3266d27e4f6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.333066] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ddcd4f-4291-4779-ac12-c6f8baa5747d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.368997] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-29e54e35-9876-4298-99f0-83987a4d3d23 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.376741] env[68492]: DEBUG oslo_vmware.api [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Task: {'id': task-3395401, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 948.378244] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 948.378436] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 948.378610] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.378807] env[68492]: INFO nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Took 0.62 seconds to destroy the instance on the hypervisor. 
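The DeleteDatastoreFile_Task sequence above is the task-polling pattern this log repeats throughout: issue a vCenter task, then poll it ("progress is 0%" ... "completed successfully") until it reaches a terminal state. Below is a minimal stdlib sketch of that loop; the function and the shape of the task-info dict are illustrative assumptions for the example, not oslo.vmware's actual interface.

```python
# Illustrative sketch of the poll-until-terminal loop behind the
# wait_for_task / _poll_task entries above. The poll_fn contract below is an
# assumption for the example, not oslo.vmware's real API.
import time

def wait_for_task(poll_fn, interval=0.5, timeout=300.0):
    """Poll poll_fn() until the task succeeds, fails, or times out.

    poll_fn is assumed to return a dict such as
    {'state': 'running', 'progress': 0},
    {'state': 'success', 'result': ...}, or
    {'state': 'error', 'fault': 'InvalidArgument'}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll_fn()
        if info['state'] == 'success':
            return info.get('result')      # cf. 'duration_secs': 0.08 above
        if info['state'] == 'error':
            # oslo.vmware translates the task fault into an exception here;
            # the VimFaultException traceback later in this log comes from
            # exactly this terminal branch.
            raise RuntimeError(f"task failed: {info['fault']}")
        time.sleep(interval)               # task still queued / running
    raise TimeoutError('task did not reach a terminal state in time')
```

Nova blocks a greenthread on this loop rather than an OS thread (note the eventlet evt.wait()/hub.switch() frames in the traceback below), so a slow vCenter task stalls only the one build, not the whole compute process.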
[ 948.380818] env[68492]: DEBUG nova.compute.claims [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 948.380996] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.381227] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 948.383649] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-64e5ba2e-ced9-47cc-8a4d-10f1a272bc13 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.405258] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 948.486596] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 948.566579] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 948.566870] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 948.823916] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9025af8-9092-4675-8f96-bde29d28a892 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.831503] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d45c021-d8b1-4712-a93f-db4644d0ac7f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.861442] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5381d225-b078-4637-8f14-7c821a2f6fa4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.868804] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20fbe2d7-e578-4db4-bac9-d59a6f9c51ba {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.882132] env[68492]: DEBUG nova.compute.provider_tree [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 948.890034] env[68492]: DEBUG nova.scheduler.client.report [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 948.910552] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.529s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.911211] env[68492]: ERROR nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.911211] env[68492]: Faults: ['InvalidArgument'] [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Traceback (most recent call last): [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 
948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self.driver.spawn(context, instance, image_meta, [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self._fetch_image_if_missing(context, vi) [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] image_cache(vi, tmp_image_ds_loc) [ 948.911211] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] vm_util.copy_virtual_disk( [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] session._wait_for_task(vmdk_copy_task) [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return self.wait_for_task(task_ref) [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return evt.wait() [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] result = hub.switch() [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] return self.greenlet.switch() [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 948.913439] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] self.f(*self.args, **self.kw) [ 948.913821] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 948.913821] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] raise exceptions.translate_fault(task_info.error) [ 948.913821] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.913821] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Faults: ['InvalidArgument'] [ 948.913821] env[68492]: ERROR nova.compute.manager [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] [ 948.913821] env[68492]: DEBUG nova.compute.utils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 948.913821] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Build of instance b7e0d1c7-d21b-42c1-b400-86be946df689 was re-scheduled: A specified parameter was not correct: fileType [ 948.913821] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 948.914031] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 948.914031] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 948.914031] env[68492]: DEBUG nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 948.914188] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 949.336541] env[68492]: DEBUG nova.network.neutron [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.353864] env[68492]: INFO nova.compute.manager [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Took 0.44 seconds to deallocate network for instance. [ 949.458578] env[68492]: INFO nova.scheduler.client.report [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Deleted allocations for instance b7e0d1c7-d21b-42c1-b400-86be946df689 [ 949.481461] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4fa6f8e-9555-4ce6-a09b-e2c587cfe934 tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 373.776s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.482722] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 175.665s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 949.483437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Acquiring lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 949.483437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 949.483437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.487720] env[68492]: INFO nova.compute.manager [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Terminating instance [ 949.489856] env[68492]: DEBUG nova.compute.manager [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 949.490148] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 949.490330] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2849f51b-bb71-4045-9833-521ed1128023 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.499118] env[68492]: DEBUG nova.compute.manager [None req-0d3f650b-ef47-4541-be9f-32f35f198681 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: eae1ea40-8ebd-4b7a-9489-e0e70653a517] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 949.506806] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d366ea35-d629-4c40-a7a3-04ecef04b46f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.538208] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b7e0d1c7-d21b-42c1-b400-86be946df689 could not be found. 
[ 949.538424] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 949.538602] env[68492]: INFO nova.compute.manager [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Took 0.05 seconds to destroy the instance on the hypervisor. [ 949.538848] env[68492]: DEBUG oslo.service.loopingcall [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 949.539271] env[68492]: DEBUG nova.compute.manager [None req-0d3f650b-ef47-4541-be9f-32f35f198681 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] [instance: eae1ea40-8ebd-4b7a-9489-e0e70653a517] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 949.540173] env[68492]: DEBUG nova.compute.manager [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 949.540280] env[68492]: DEBUG nova.network.neutron [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 949.565479] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0d3f650b-ef47-4541-be9f-32f35f198681 tempest-DeleteServersAdminTestJSON-1009414491 tempest-DeleteServersAdminTestJSON-1009414491-project-member] Lock "eae1ea40-8ebd-4b7a-9489-e0e70653a517" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.805s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.569169] env[68492]: DEBUG nova.network.neutron [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.577064] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 949.580910] env[68492]: INFO nova.compute.manager [-] [instance: b7e0d1c7-d21b-42c1-b400-86be946df689] Took 0.04 seconds to deallocate network for instance. 
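For the compute_resources claim that follows, the usable capacity behind "Claim successful" comes from the inventory records echoed in the next entries, via the standard Placement arithmetic (total - reserved) * allocation_ratio. Worked out with this node's numbers:

```python
# Capacity implied by the inventory data repeated in this log, using the
# standard Placement formula (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

def usable(inv):
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(usable(inventory))
# -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
```

Against that, an m1.nano claim (1 vCPU, 128 MB RAM, 1 GB root disk, per the flavor dump further down) is negligible, which is why the claim is granted immediately and the inventory is reported as unchanged.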
[ 949.628457] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 949.628788] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 949.630240] env[68492]: INFO nova.compute.claims [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 949.699732] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b26d1c42-e42b-4b58-823e-e3f569e068ee tempest-ServersAdminNegativeTestJSON-642803759 tempest-ServersAdminNegativeTestJSON-642803759-project-member] Lock "b7e0d1c7-d21b-42c1-b400-86be946df689" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.217s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 950.021155] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6554c79d-78c5-4719-9944-3c3ced5c786f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.029378] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25c6155-d692-4c4e-a8e2-c3cd319fe822 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.066563] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bee62cc4-decf-45ca-b233-a2e1bc54b91c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.074129] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-210f573c-c5ce-4116-a692-40fcc674e495 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.087151] env[68492]: DEBUG nova.compute.provider_tree [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 950.098288] env[68492]: DEBUG nova.scheduler.client.report [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 950.114867] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.486s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 950.115407] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 950.152845] env[68492]: DEBUG nova.compute.utils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 950.153649] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 950.153818] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 950.167478] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Start building block device mappings for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 950.215907] env[68492]: DEBUG nova.policy [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e6cc485c02243bea6188efa2f792a01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc95b77eecea47c081122f8452ea71a1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 950.236027] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 950.267126] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 950.267377] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 950.267536] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 950.267745] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 950.267927] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 950.268237] env[68492]: DEBUG 
nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 950.268487] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 950.268659] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 950.268851] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 950.268977] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 950.269307] env[68492]: DEBUG nova.virt.hardware [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 950.270744] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af441001-dd8a-46fc-96bc-f784086f3552 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.283295] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-640491ff-ed89-4dda-aee6-0fd712163d8b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.531094] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Successfully created port: 05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 951.429023] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Successfully updated port: 05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 951.443465] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 
tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 951.443642] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquired lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 951.443932] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 951.489893] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 951.661611] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Updating instance_info_cache with network_info: [{"id": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "address": "fa:16:3e:34:de:78", "network": {"id": "881ae598-9395-4f70-98f5-809bb2d63c40", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1276312899-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc95b77eecea47c081122f8452ea71a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05f976bf-e6", "ovs_interfaceid": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.674510] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Releasing lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 951.674780] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] 
[instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance network_info: |[{"id": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "address": "fa:16:3e:34:de:78", "network": {"id": "881ae598-9395-4f70-98f5-809bb2d63c40", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1276312899-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc95b77eecea47c081122f8452ea71a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05f976bf-e6", "ovs_interfaceid": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 951.675184] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:de:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5446413d-c3b0-4cd2-a962-62240db178ac', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '05f976bf-e618-414c-bc66-fcfcfd8df05f', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 951.684011] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Creating folder: Project (dc95b77eecea47c081122f8452ea71a1). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 951.684766] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f393448a-79b8-4d2c-9fac-bac888bfa66c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.697492] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Created folder: Project (dc95b77eecea47c081122f8452ea71a1) in parent group-v677434. [ 951.697698] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Creating folder: Instances. Parent ref: group-v677488. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 951.697916] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9bd7b90f-b08b-4bd7-a95f-ee3ad6e0ad27 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.706536] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Created folder: Instances in parent group-v677488. [ 951.706786] env[68492]: DEBUG oslo.service.loopingcall [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 951.707043] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 951.707212] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2c3a2503-1bf2-4dd7-a47e-649bfa15e010 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.728000] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 951.728000] env[68492]: value = "task-3395404" [ 951.728000] env[68492]: _type = "Task" [ 951.728000] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 951.737150] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395404, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 951.976656] env[68492]: DEBUG nova.compute.manager [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Received event network-vif-plugged-05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 951.976656] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Acquiring lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.976656] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.976656] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.976819] env[68492]: DEBUG nova.compute.manager [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] No waiting events found dispatching network-vif-plugged-05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 951.976934] env[68492]: WARNING nova.compute.manager [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Received unexpected event network-vif-plugged-05f976bf-e618-414c-bc66-fcfcfd8df05f for instance with vm_state building and task_state spawning. [ 951.978310] env[68492]: DEBUG nova.compute.manager [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Received event network-changed-05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 951.978587] env[68492]: DEBUG nova.compute.manager [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Refreshing instance network info cache due to event network-changed-05f976bf-e618-414c-bc66-fcfcfd8df05f. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 951.978858] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Acquiring lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 951.979097] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Acquired lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 951.980401] env[68492]: DEBUG nova.network.neutron [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Refreshing network info cache for port 05f976bf-e618-414c-bc66-fcfcfd8df05f {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 952.239025] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395404, 'name': CreateVM_Task, 'duration_secs': 0.288341} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 952.239274] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 952.240020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 952.240243] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 952.240651] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 952.241287] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43d1f512-a22f-4f52-9fca-c53bd15f3c7d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.246045] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for the task: (returnval){ [ 952.246045] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52eef52b-00c1-29f9-14e7-fadd05266e3c" [ 952.246045] env[68492]: _type = "Task" [ 952.246045] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 952.255868] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52eef52b-00c1-29f9-14e7-fadd05266e3c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 952.256721] env[68492]: DEBUG nova.network.neutron [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Updated VIF entry in instance network info cache for port 05f976bf-e618-414c-bc66-fcfcfd8df05f. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 952.258149] env[68492]: DEBUG nova.network.neutron [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Updating instance_info_cache with network_info: [{"id": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "address": "fa:16:3e:34:de:78", "network": {"id": "881ae598-9395-4f70-98f5-809bb2d63c40", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1276312899-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc95b77eecea47c081122f8452ea71a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5446413d-c3b0-4cd2-a962-62240db178ac", "external-id": "nsx-vlan-transportzone-528", "segmentation_id": 528, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05f976bf-e6", "ovs_interfaceid": "05f976bf-e618-414c-bc66-fcfcfd8df05f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.273623] env[68492]: DEBUG oslo_concurrency.lockutils [req-f8ceb984-35ec-4c6b-8f8f-adac9d8cf887 req-a05b1f73-43a1-4e87-849d-e3de1831c617 service nova] Releasing lock "refresh_cache-8c72085d-697c-4829-866a-4d642f18d2f6" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 952.758776] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 952.758776] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 952.758776] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 954.465022] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "913d527c-f9f8-43da-b539-d1e2e2b71528" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.465022] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 957.024457] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 957.025045] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.169442] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "acbc1e36-0803-44ff-8ebc-094083193bc4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 980.320346] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 980.321265] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock 
"aacdc31e-9a31-4745-b48b-f23a3b16ae9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 992.226609] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 993.107487] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "8c72085d-697c-4829-866a-4d642f18d2f6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.231015] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 994.440414] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4edb98c-865a-44f7-8205-7703fad07800 tempest-ImagesOneServerNegativeTestJSON-1666722731 tempest-ImagesOneServerNegativeTestJSON-1666722731-project-member] Acquiring lock "5c5946ea-9bda-4c9c-80cb-e8a580b74148" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.441210] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4edb98c-865a-44f7-8205-7703fad07800 tempest-ImagesOneServerNegativeTestJSON-1666722731 tempest-ImagesOneServerNegativeTestJSON-1666722731-project-member] Lock "5c5946ea-9bda-4c9c-80cb-e8a580b74148" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.652792] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d9802bbb-f996-4d5b-916e-ae83961094c7 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "d5f69f3c-ef44-462e-817a-3258de5f5ff3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.653064] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d9802bbb-f996-4d5b-916e-ae83961094c7 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "d5f69f3c-ef44-462e-817a-3258de5f5ff3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 996.232557] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
996.232739] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 996.232862] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 996.254940] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255119] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255253] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255379] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255501] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255636] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255730] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255849] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.255963] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.256089] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 996.256210] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 996.567512] env[68492]: DEBUG oslo_concurrency.lockutils [None req-28ac2215-fec8-4cf0-85aa-cec8c31ae2e8 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "37f2e678-b217-4bf3-83e6-74d85ee8a446" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 996.567740] env[68492]: DEBUG oslo_concurrency.lockutils [None req-28ac2215-fec8-4cf0-85aa-cec8c31ae2e8 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "37f2e678-b217-4bf3-83e6-74d85ee8a446" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 997.230735] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.231115] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.231115] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.243291] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 997.243510] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 997.243677] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 997.243851] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
997.244999] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69b5699b-fbfb-49a2-9e53-e297989f969f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.253568] env[68492]: WARNING oslo_vmware.rw_handles [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 997.253568] env[68492]: ERROR oslo_vmware.rw_handles [ 997.253978] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 997.255765] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 997.256059] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Copying Virtual Disk [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/433affa2-ae7b-4470-81a0-fd4b891ca5aa/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 997.257213] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-312c4bc3-3ea3-408f-bcdd-d33525ade47e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
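Annotation: the entries just above and below trace oslo.vmware's task pattern as it appears throughout this log. Nova asks vCenter to run a server-side task (here VirtualDiskManager.CopyVirtualDisk_Task, surfaced as task-3395405), then wait_for_task/_poll_task in oslo_vmware/api.py polls it, logging "progress is N%" until the task either completes successfully or raises a translated fault such as the VimFaultException that follows later in this log. Below is a minimal, self-contained sketch of that polling loop; the session object, its get_task_info() accessor, the dict-shaped task info, and the poll interval are all hypothetical stand-ins for illustration, not the actual oslo.vmware API.

    import time

    POLL_INTERVAL = 0.5  # assumed seconds between polls; not oslo.vmware's real default

    def wait_for_task(session, task_ref):
        # Poll until the vCenter task leaves the queued/running states,
        # mirroring the "progress is 0%" / "completed successfully" entries.
        while True:
            info = session.get_task_info(task_ref)  # hypothetical accessor
            if info['state'] in ('queued', 'running'):
                print("Task: %s progress is %s%%" % (task_ref, info.get('progress', 0)))
                time.sleep(POLL_INTERVAL)
            elif info['state'] == 'success':
                print("Task: %s completed successfully." % task_ref)
                return info.get('result')
            else:
                # 'error' state: surface the fault to the caller, as the
                # InvalidArgument fault is raised out of _poll_task below.
                raise RuntimeError(info['error'])

    class _FakeSession:
        # Toy stand-in so the sketch runs as-is: reports one running poll,
        # then success, like task-3395404/3395405 in the surrounding entries.
        def __init__(self):
            self._polls = 0

        def get_task_info(self, task_ref):
            self._polls += 1
            if self._polls < 2:
                return {'state': 'running', 'progress': 0}
            return {'state': 'success', 'result': None}

    wait_for_task(_FakeSession(), 'task-3395405')
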
[ 997.261047] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3d0fd58d-0e0c-4145-b88b-a62bd652a025 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.274395] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-362a7fe8-9491-4938-90b2-e0dfbb83737d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.276833] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for the task: (returnval){ [ 997.276833] env[68492]: value = "task-3395405" [ 997.276833] env[68492]: _type = "Task" [ 997.276833] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.282726] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32747ad6-4384-4775-a7e6-3431f1013358 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.288465] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Task: {'id': task-3395405, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 997.318389] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 997.318563] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 997.318791] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 997.393716] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 12450355-d90e-40dc-b66f-6105ec320d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.393937] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance acbc1e36-0803-44ff-8ebc-094083193bc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394107] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394238] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394361] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394481] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394600] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394724] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394851] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.394966] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 997.407106] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 09401266-1c03-4c2e-b850-e7196bcb1e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.417380] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.429780] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f48567a8-6b74-46ee-af6b-37823323e17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.439583] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a59a286e-ad8c-4628-b326-09762dea3534 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.448924] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2598cded-78b6-4230-98c5-7068b429a56c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.458053] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ab820eba-d4d5-4b07-bc68-79c4b8cf46c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.467144] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0de36474-6ab2-4c5c-a85c-5080d82b3f8e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.476376] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49db2997-6ee3-4cbd-b640-77ad352ae2fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.486662] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance dacc9b15-d2d0-4d7e-b419-eff947683f42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.496860] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b8f3a42e-9412-408f-bbbc-2d7a542bd82e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.506289] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fc27ef4a-0a1d-49c7-b96d-5a57810117bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.515859] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.525864] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.539236] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.549843] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.559360] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.568654] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5c5946ea-9bda-4c9c-80cb-e8a580b74148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.578069] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 37f2e678-b217-4bf3-83e6-74d85ee8a446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 997.578351] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 997.578531] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 997.790168] env[68492]: DEBUG oslo_vmware.exceptions [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 997.790168] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 997.790168] env[68492]: ERROR nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 997.790168] env[68492]: Faults: ['InvalidArgument'] [ 997.790168] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Traceback (most recent call last): [ 997.790168] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 997.790168] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] yield resources [ 997.790168] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self.driver.spawn(context, instance, image_meta, [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self._fetch_image_if_missing(context, vi) [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] image_cache(vi, tmp_image_ds_loc) [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] vm_util.copy_virtual_disk( [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] session._wait_for_task(vmdk_copy_task) [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 997.790494] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return self.wait_for_task(task_ref) [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return evt.wait() [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] result = hub.switch() [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return self.greenlet.switch() [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self.f(*self.args, **self.kw) [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] raise exceptions.translate_fault(task_info.error) [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Faults: ['InvalidArgument'] [ 997.790825] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] [ 997.791193] env[68492]: INFO nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Terminating instance [ 997.792445] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 997.792638] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 997.795062] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 997.795268] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 997.796044] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-935ca442-cebd-4516-ba54-9348731da90c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.799830] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e920e782-f37a-429f-bcbf-b2bc135f016e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.806711] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 997.807974] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a4a516d1-5676-4668-929e-e50df4e0faed {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.809804] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 997.809982] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 997.812891] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82f328bc-45d4-42ea-ad11-ab3b441d906f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.818992] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 997.818992] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52062a2a-97c1-9668-2fa3-b2ea2eb0494c" [ 997.818992] env[68492]: _type = "Task" [ 997.818992] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.826479] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52062a2a-97c1-9668-2fa3-b2ea2eb0494c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 997.886299] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 997.886519] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 997.886703] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Deleting the datastore file [datastore2] 12450355-d90e-40dc-b66f-6105ec320d19 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 997.886944] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f20b809c-f79f-452c-a5a1-9bd03e56a6f4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.893610] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for the task: (returnval){ [ 997.893610] env[68492]: value = "task-3395407" [ 997.893610] env[68492]: _type = "Task" [ 997.893610] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.903664] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Task: {'id': task-3395407, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 997.919295] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fd554c3-d501-437e-9a07-15d2d7e60158 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.926523] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c39386ae-ec5e-4819-961d-b04aebe4947e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.956439] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84a9a4e8-800e-4eb9-b052-d0290c1516bf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.963889] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f19614b-27f7-43b2-a61c-76b6ec58bbc0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.977191] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 997.987784] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 998.004125] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 998.004331] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 998.330254] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 998.330673] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating directory with path [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 998.330866] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc83407e-ceb1-4683-b3c6-2252caca6ed0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.342953] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created directory with path [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 998.343157] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Fetch image to [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 998.343335] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 998.344114] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d049443-5f64-4c15-842e-df61cb6763b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.350876] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88cfb8ad-4219-4d98-9c8b-5032a940bf46 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.360649] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e37aaa51-4a52-4ef5-b392-18fc34a277d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.391623] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b322123-bfa5-4988-93cb-4cbd58f35217 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.400670] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b8a7318c-6724-417b-860d-f81e09160d64 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.407211] env[68492]: DEBUG oslo_vmware.api [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Task: {'id': task-3395407, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079079} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 998.407445] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 998.407623] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 998.407790] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 998.407974] env[68492]: INFO nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Took 0.62 seconds to destroy the instance on the hypervisor. [ 998.410188] env[68492]: DEBUG nova.compute.claims [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 998.410361] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 998.410573] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 998.425622] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 998.479288] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 998.543182] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 998.543386] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 998.835504] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8f99461d-48bc-4adc-b558-823ed4a0b541 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "888dac8e-013f-4024-9fa7-4cc13c361268" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 998.835741] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8f99461d-48bc-4adc-b558-823ed4a0b541 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "888dac8e-013f-4024-9fa7-4cc13c361268" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 998.836538] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b134a2e1-b456-420f-bd14-2c02fba76f7a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.844711] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9d51ff5-7b91-4d94-8848-dfc11dc47221 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.875385] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db81543b-d74e-487f-984b-c70e3a727f6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.883239] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3b1a39-e9dd-4a2f-89b9-0c82ee8ab757 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.897096] env[68492]: DEBUG nova.compute.provider_tree [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Inventory has not changed in ProviderTree for provider: 
dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 998.905603] env[68492]: DEBUG nova.scheduler.client.report [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 998.919858] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.509s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 998.920333] env[68492]: ERROR nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 998.920333] env[68492]: Faults: ['InvalidArgument'] [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Traceback (most recent call last): [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self.driver.spawn(context, instance, image_meta, [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self._fetch_image_if_missing(context, vi) [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] image_cache(vi, tmp_image_ds_loc) [ 998.920333] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] vm_util.copy_virtual_disk( [ 
998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] session._wait_for_task(vmdk_copy_task) [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return self.wait_for_task(task_ref) [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return evt.wait() [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] result = hub.switch() [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] return self.greenlet.switch() [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 998.920619] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] self.f(*self.args, **self.kw) [ 998.920948] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 998.920948] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] raise exceptions.translate_fault(task_info.error) [ 998.920948] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 998.920948] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Faults: ['InvalidArgument'] [ 998.920948] env[68492]: ERROR nova.compute.manager [instance: 12450355-d90e-40dc-b66f-6105ec320d19] [ 998.921074] env[68492]: DEBUG nova.compute.utils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 998.922833] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Build of instance 12450355-d90e-40dc-b66f-6105ec320d19 was re-scheduled: A specified parameter was not correct: fileType [ 998.922833] env[68492]: Faults: ['InvalidArgument'] 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 998.923221] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 998.923390] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 998.923557] env[68492]: DEBUG nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 998.923752] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 999.235046] env[68492]: DEBUG nova.network.neutron [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.250244] env[68492]: INFO nova.compute.manager [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Took 0.33 seconds to deallocate network for instance. 
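Editor's note: the traceback above ends in oslo_vmware's _poll_task raising exceptions.translate_fault(task_info.error), and the "Task: {...} completed successfully" / "progress is 0%" entries elsewhere in this log come from the same polling loop (wait_for_task). The following is a minimal sketch of that mechanism, assuming illustrative names throughout (TaskInfo, VimFaultError, poll_task are not the real oslo.vmware API):

# Minimal sketch of a vSphere-style task poll loop, modeled on the
# wait_for_task/_poll_task behaviour visible in the traceback above.
# All names here are illustrative assumptions, not the oslo.vmware API.
import time
from dataclasses import dataclass, field

@dataclass
class TaskInfo:
    state: str              # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    error: str = ''
    faults: list = field(default_factory=list)

class VimFaultError(Exception):
    """Stand-in for the VimFaultException raised on a failed task."""
    def __init__(self, msg, faults):
        super().__init__("%s\nFaults: %r" % (msg, faults))
        self.faults = faults

def poll_task(fetch_info, interval=0.5, timeout=300):
    """Poll fetch_info() until the task succeeds, fails, or times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = fetch_info()
        if info.state == 'success':
            return info                   # -> "completed successfully"
        if info.state == 'error':
            # mirrors: raise exceptions.translate_fault(task_info.error)
            raise VimFaultError(info.error, info.faults)
        # while running, the real loop logs "progress is N%." each pass
        time.sleep(interval)
    raise TimeoutError('task did not complete within %ss' % timeout)

Under this sketch, the InvalidArgument failure above corresponds to fetch_info() returning state='error' with error="A specified parameter was not correct: fileType", which the loop converts into the raised exception that _build_and_run_instance then handles by re-scheduling.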
[ 999.366942] env[68492]: INFO nova.scheduler.client.report [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Deleted allocations for instance 12450355-d90e-40dc-b66f-6105ec320d19 [ 999.387193] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b63921a-c3cb-4ea9-97ea-2d7a2c371bbc tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 422.345s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.388699] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 222.914s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.388919] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Acquiring lock "12450355-d90e-40dc-b66f-6105ec320d19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.389146] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.389312] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.391681] env[68492]: INFO nova.compute.manager [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Terminating instance [ 999.393696] env[68492]: DEBUG nova.compute.manager [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 999.393696] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 999.394052] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-97b1d620-a75b-4812-9a42-9aa941f324d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.403992] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b1e41a8-3c50-4d97-8eec-a24cd9f2a698 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.414848] env[68492]: DEBUG nova.compute.manager [None req-5fa220eb-d4a4-41c9-a4c2-e897af89ef90 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4f4669ef-c7da-4f9a-9ebe-83947f00863a] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 999.438796] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 12450355-d90e-40dc-b66f-6105ec320d19 could not be found. [ 999.439015] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 999.439205] env[68492]: INFO nova.compute.manager [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Took 0.05 seconds to destroy the instance on the hypervisor. [ 999.439445] env[68492]: DEBUG oslo.service.loopingcall [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 999.439841] env[68492]: DEBUG nova.compute.manager [None req-5fa220eb-d4a4-41c9-a4c2-e897af89ef90 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4f4669ef-c7da-4f9a-9ebe-83947f00863a] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 999.441170] env[68492]: DEBUG nova.compute.manager [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 999.441303] env[68492]: DEBUG nova.network.neutron [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 999.459711] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5fa220eb-d4a4-41c9-a4c2-e897af89ef90 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4f4669ef-c7da-4f9a-9ebe-83947f00863a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.257s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.463270] env[68492]: DEBUG nova.network.neutron [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.467640] env[68492]: DEBUG nova.compute.manager [None req-c4c57657-212f-4931-a6fb-6f36858f9df1 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 09401266-1c03-4c2e-b850-e7196bcb1e9d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 999.471078] env[68492]: INFO nova.compute.manager [-] [instance: 12450355-d90e-40dc-b66f-6105ec320d19] Took 0.03 seconds to deallocate network for instance. [ 999.488925] env[68492]: DEBUG nova.compute.manager [None req-c4c57657-212f-4931-a6fb-6f36858f9df1 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 09401266-1c03-4c2e-b850-e7196bcb1e9d] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 999.507317] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c4c57657-212f-4931-a6fb-6f36858f9df1 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "09401266-1c03-4c2e-b850-e7196bcb1e9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.433s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.515507] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 999.555009] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0e157327-421c-43e8-9865-77ef6d85a445 tempest-VolumesAssistedSnapshotsTest-1763465267 tempest-VolumesAssistedSnapshotsTest-1763465267-project-member] Lock "12450355-d90e-40dc-b66f-6105ec320d19" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.166s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.576076] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.576362] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.577868] env[68492]: INFO nova.compute.claims [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 999.924756] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a34331ac-3cb4-41d7-a14a-2eb73b36aee2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.933318] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d386006-ec12-4bd9-8b76-8ffe50c0e3b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.967162] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91cc2fab-f3b3-40a6-a484-ac3005e45030 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.974272] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0d1edfc-67e7-4922-99df-5fe1bbe71491 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.987291] env[68492]: DEBUG nova.compute.provider_tree [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 999.995628] env[68492]: DEBUG nova.scheduler.client.report [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 
1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1000.003547] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.013205] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.436s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1000.013205] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1000.048967] env[68492]: DEBUG nova.compute.utils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1000.048967] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1000.048967] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1000.056792] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Start building block device mappings for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1000.119094] env[68492]: DEBUG nova.policy [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '14a862566f5d435cb4e5fd2506bd3de6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7cdd3c098a147a9a352af49da3e7561', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1000.128126] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1000.156280] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1000.156525] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1000.156680] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1000.156858] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1000.157017] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1000.157257] env[68492]: DEBUG 
nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1000.157513] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1000.157682] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1000.157851] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1000.158033] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1000.158215] env[68492]: DEBUG nova.virt.hardware [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1000.159125] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42880b05-4cc7-4fa7-913f-c437781742e4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.167708] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e558218-481e-4b8a-ba1f-f43653325aa0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.230746] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.230960] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.231119] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1000.441152] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Successfully created port: 1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1001.000620] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Successfully updated port: 1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1001.016681] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1001.016837] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquired lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1001.016988] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1001.059407] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1001.224049] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Updating instance_info_cache with network_info: [{"id": "1f912bb0-9ba8-4355-932a-7a351915e43b", "address": "fa:16:3e:d6:c7:78", "network": {"id": "c77ee2df-46e1-43db-8621-d3828f824869", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1433659065-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7cdd3c098a147a9a352af49da3e7561", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f912bb0-9b", "ovs_interfaceid": "1f912bb0-9ba8-4355-932a-7a351915e43b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.231159] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.238095] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Releasing lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1001.238435] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance network_info: |[{"id": "1f912bb0-9ba8-4355-932a-7a351915e43b", "address": "fa:16:3e:d6:c7:78", "network": {"id": "c77ee2df-46e1-43db-8621-d3828f824869", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1433659065-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7cdd3c098a147a9a352af49da3e7561", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f912bb0-9b", "ovs_interfaceid": 
"1f912bb0-9ba8-4355-932a-7a351915e43b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1001.238825] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:c7:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6eb7e3e9-5cc2-40f1-a6eb-f70f06531667', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1f912bb0-9ba8-4355-932a-7a351915e43b', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1001.246805] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Creating folder: Project (b7cdd3c098a147a9a352af49da3e7561). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1001.247331] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dc0bcb89-0d7a-4545-9ce7-f81de1b3c2ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.257313] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Created folder: Project (b7cdd3c098a147a9a352af49da3e7561) in parent group-v677434. [ 1001.257498] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Creating folder: Instances. Parent ref: group-v677491. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1001.257716] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-86cfe32e-d0c8-480c-bd31-28cd698c5782 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.266609] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Created folder: Instances in parent group-v677491. [ 1001.266836] env[68492]: DEBUG oslo.service.loopingcall [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1001.267029] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1001.267225] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-96a7202e-c32f-45c2-9b85-fbbeb4e07acf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.286167] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1001.286167] env[68492]: value = "task-3395410" [ 1001.286167] env[68492]: _type = "Task" [ 1001.286167] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1001.293845] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395410, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1001.359337] env[68492]: DEBUG nova.compute.manager [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Received event network-vif-plugged-1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1001.359479] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Acquiring lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.359721] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.359907] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.360632] env[68492]: DEBUG nova.compute.manager [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] No waiting events found dispatching network-vif-plugged-1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1001.360834] env[68492]: WARNING nova.compute.manager [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Received unexpected event network-vif-plugged-1f912bb0-9ba8-4355-932a-7a351915e43b for instance with vm_state building and task_state deleting. 
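Editor's note: the paired 'Acquiring lock "X" by "Y"' / 'acquired ... waited 0.000s' / '"released" ... held 0.509s' entries that recur throughout this log are emitted by oslo_concurrency.lockutils' inner wrapper (lockutils.py:404/409/423). The sketch below reproduces that waited/held accounting with a plain threading.Lock; it is an illustration of the logging pattern, not the real oslo_concurrency implementation:

# Sketch of the lock accounting behind the recurring
# 'Acquiring lock "X" by "Y" ... waited 0.000s ... held 0.000s' entries.
# Uses a plain threading.Lock; oslo_concurrency differs in detail.
import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()

@contextmanager
def timed_lock(name, caller):
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print('Acquiring lock "%s" by "%s"' % (name, caller))
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    print('Lock "%s" acquired by "%s" :: waited %.3fs'
          % (name, caller, t1 - t0))
    try:
        yield
    finally:
        lock.release()
        print('Lock "%s" "released" by "%s" :: held %.3fs'
              % (name, caller, time.monotonic() - t1))

# usage, mirroring the compute_resources claim pattern in this log:
with timed_lock("compute_resources",
                "nova.compute.resource_tracker.ResourceTracker.instance_claim"):
    pass  # resource-claim work would happen here

The "waited" figure measures contention on the named lock (0.000s when uncontended, as in most entries here), while "held" measures how long the critical section ran, e.g. the 0.509s hold during the abort_instance_claim earlier in this section.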
[ 1001.361010] env[68492]: DEBUG nova.compute.manager [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Received event network-changed-1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1001.361179] env[68492]: DEBUG nova.compute.manager [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Refreshing instance network info cache due to event network-changed-1f912bb0-9ba8-4355-932a-7a351915e43b. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1001.361381] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Acquiring lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1001.361543] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Acquired lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1001.361736] env[68492]: DEBUG nova.network.neutron [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Refreshing network info cache for port 1f912bb0-9ba8-4355-932a-7a351915e43b {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1001.378048] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.795516] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395410, 'name': CreateVM_Task, 'duration_secs': 0.287634} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1001.795778] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1001.796362] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1001.796527] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1001.796840] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1001.797103] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8b9e2e27-ce31-4279-bab4-b1851a5f473c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.801351] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for the task: (returnval){ [ 1001.801351] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e2e5ac-62f7-2518-268a-5dbf8f97d6cd" [ 1001.801351] env[68492]: _type = "Task" [ 1001.801351] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1001.808859] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e2e5ac-62f7-2518-268a-5dbf8f97d6cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1001.811668] env[68492]: DEBUG nova.network.neutron [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Updated VIF entry in instance network info cache for port 1f912bb0-9ba8-4355-932a-7a351915e43b. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1001.812026] env[68492]: DEBUG nova.network.neutron [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Updating instance_info_cache with network_info: [{"id": "1f912bb0-9ba8-4355-932a-7a351915e43b", "address": "fa:16:3e:d6:c7:78", "network": {"id": "c77ee2df-46e1-43db-8621-d3828f824869", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1433659065-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7cdd3c098a147a9a352af49da3e7561", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f912bb0-9b", "ovs_interfaceid": "1f912bb0-9ba8-4355-932a-7a351915e43b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.820963] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f19618c-62c1-497b-933d-6f1f06b13332 req-8dfa5d5f-e1cf-49d2-93b3-71f8ada17c2f service nova] Releasing lock "refresh_cache-bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1002.311911] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1002.312206] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1002.312419] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1008.161702] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1008.162438] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.715823] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c32eff8-b21b-4192-ba43-0a04f901898a tempest-ServerShowV254Test-391686084 tempest-ServerShowV254Test-391686084-project-member] Acquiring lock "a6bf3888-5c1a-4a12-85a9-221cbba6457b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.716158] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c32eff8-b21b-4192-ba43-0a04f901898a tempest-ServerShowV254Test-391686084 tempest-ServerShowV254Test-391686084-project-member] Lock "a6bf3888-5c1a-4a12-85a9-221cbba6457b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1017.651297] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b4483b89-80dc-48ce-8ff6-d66c4bfdd20a tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Acquiring lock "2785a54b-6fd5-413d-bdd1-ead082d8777b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1017.651578] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b4483b89-80dc-48ce-8ff6-d66c4bfdd20a tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Lock "2785a54b-6fd5-413d-bdd1-ead082d8777b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1019.868675] env[68492]: DEBUG oslo_concurrency.lockutils [None req-410dc1bf-9835-4db5-8451-2b7d653584bd tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Acquiring lock "2d422f7c-9295-4b08-a623-ae07bacb3e9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1019.868991] env[68492]: DEBUG oslo_concurrency.lockutils [None req-410dc1bf-9835-4db5-8451-2b7d653584bd tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "2d422f7c-9295-4b08-a623-ae07bacb3e9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1033.382405] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc1832ca-ef49-4005-9506-ca15c7b0e976 
tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Acquiring lock "61d932c3-4c41-4648-b5ee-c083ed425e1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1033.382793] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc1832ca-ef49-4005-9506-ca15c7b0e976 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "61d932c3-4c41-4648-b5ee-c083ed425e1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1037.427619] env[68492]: DEBUG oslo_concurrency.lockutils [None req-109c6c0f-9e3b-4501-99b3-c0860c4ee4a4 tempest-InstanceActionsTestJSON-1991500879 tempest-InstanceActionsTestJSON-1991500879-project-member] Acquiring lock "c9618d2a-72ce-4395-b739-2585861bc446" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1037.427619] env[68492]: DEBUG oslo_concurrency.lockutils [None req-109c6c0f-9e3b-4501-99b3-c0860c4ee4a4 tempest-InstanceActionsTestJSON-1991500879 tempest-InstanceActionsTestJSON-1991500879-project-member] Lock "c9618d2a-72ce-4395-b739-2585861bc446" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1041.941189] env[68492]: DEBUG oslo_concurrency.lockutils [None req-74ebf5f0-6bb4-41b3-876a-2ece4ed79bbc tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Acquiring lock "9bffaa25-3195-4077-a978-6b0dcc4b8ecd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1041.941519] env[68492]: DEBUG oslo_concurrency.lockutils [None req-74ebf5f0-6bb4-41b3-876a-2ece4ed79bbc tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Lock "9bffaa25-3195-4077-a978-6b0dcc4b8ecd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1046.160695] env[68492]: WARNING oslo_vmware.rw_handles [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles 
response.begin() [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1046.160695] env[68492]: ERROR oslo_vmware.rw_handles [ 1046.161269] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1046.163186] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1046.163500] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Copying Virtual Disk [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/2168659a-dd01-4d93-a312-2a7bcab8ce1b/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1046.163838] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-00060a38-3c73-4dcf-a577-ccaf36ab1164 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.172068] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 1046.172068] env[68492]: value = "task-3395411" [ 1046.172068] env[68492]: _type = "Task" [ 1046.172068] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1046.180265] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395411, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1046.682507] env[68492]: DEBUG oslo_vmware.exceptions [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1046.682790] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1046.683375] env[68492]: ERROR nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1046.683375] env[68492]: Faults: ['InvalidArgument'] [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Traceback (most recent call last): [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] yield resources [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self.driver.spawn(context, instance, image_meta, [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self._fetch_image_if_missing(context, vi) [ 1046.683375] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] image_cache(vi, tmp_image_ds_loc) [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] vm_util.copy_virtual_disk( [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] session._wait_for_task(vmdk_copy_task) [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return self.wait_for_task(task_ref) [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return evt.wait() [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] result = hub.switch() [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1046.683710] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return self.greenlet.switch() [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self.f(*self.args, **self.kw) [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] raise exceptions.translate_fault(task_info.error) [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Faults: ['InvalidArgument'] [ 1046.684085] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] [ 1046.684085] env[68492]: INFO nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Terminating instance [ 1046.685271] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1046.685489] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1046.685728] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-5dda9c9f-907c-42b7-a79b-ca8f0e102708 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.689086] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1046.689346] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1046.690230] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36f8a084-e441-4ea5-bb8e-ebdb2df83e1a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.693949] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1046.694391] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1046.697196] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-765e1093-5496-4585-8c59-a07fe52bb5b3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.699410] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1046.699857] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b4f7b283-9b72-465f-94df-2256b681e948 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.703834] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 1046.703834] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52fb8d95-2045-9b3d-bd01-214561df88d6" [ 1046.703834] env[68492]: _type = "Task" [ 1046.703834] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1046.717915] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52fb8d95-2045-9b3d-bd01-214561df88d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1046.765532] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1046.765751] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1046.765933] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleting the datastore file [datastore2] acbc1e36-0803-44ff-8ebc-094083193bc4 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1046.766229] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ee94a1ca-1972-45fa-bc6f-5d0b1002e8cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.772635] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 1046.772635] env[68492]: value = "task-3395413" [ 1046.772635] env[68492]: _type = "Task" [ 1046.772635] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1046.780098] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395413, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1047.214324] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1047.214675] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating directory with path [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1047.214946] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-95ce9a28-a62c-4661-a6bd-0c5d022ad4df {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.226586] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Created directory with path [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1047.226586] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Fetch image to [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1047.226830] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1047.227516] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef958074-f98b-470d-b627-dadd801293c6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.235638] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfdfd1e2-9867-4217-bdb8-cd238401eaf1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.244565] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90a76d23-d1e3-43d2-ab07-e5b0a1390d50 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.277996] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-42e5c3d5-a97b-4dd6-9983-363e9ac19ed3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.284785] env[68492]: DEBUG oslo_vmware.api [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395413, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074937} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1047.286284] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1047.286498] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1047.286678] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1047.286851] env[68492]: INFO nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Took 0.60 seconds to destroy the instance on the hypervisor. 
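The failure sequence above reads more clearly as a chain: the image download's HTTP write handle was closed after the remote end dropped the connection (http.client.RemoteDisconnected), the downloaded tmp-sparse.vmdk was then promoted into the image cache via VirtualDiskManager.CopyVirtualDisk_Task, and vCenter rejected that copy with InvalidArgument on fileType. The "Fault InvalidArgument not matched" DEBUG line is oslo.vmware's fault translation at work: no specific exception class is registered for that fault name, so the generic VimFaultException is raised out of _poll_task, which is exactly what both tracebacks show. Below is a minimal sketch of that poll-and-translate pattern; the names and attributes here are illustrative stand-ins for reading the log, not the real oslo.vmware API.

    import time

    class TaskFault(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException; carries the
        # server-side fault names (e.g. ['InvalidArgument']) alongside the
        # human-readable message ("A specified parameter was not correct: fileType").
        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(poll_once, interval=0.5):
        # poll_once() is assumed to return a (state, progress, error) tuple,
        # mirroring the TaskInfo snapshots behind the "progress is 0%" lines.
        while True:
            state, progress, error = poll_once()
            if state == "success":
                return progress
            if state == "error":
                # Translation step: a registry of named fault classes is
                # consulted first; when nothing matches (the "Fault
                # InvalidArgument not matched" line), fall back to the
                # generic fault exception, as in the tracebacks above.
                raise TaskFault(error["message"], error["faults"])
            time.sleep(interval)

Note that the caller (here, _cache_sparse_image via copy_virtual_disk) does not retry: the exception propagates up through spawn, the instance is destroyed, and the build is re-scheduled, which is the teardown visible in the surrounding records.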
[ 1047.288637] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa9980d5-34d6-4e87-af89-1668f2ad5c2f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.290544] env[68492]: DEBUG nova.compute.claims [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1047.290719] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1047.290933] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1047.312533] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1047.362140] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1047.423937] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1047.424186] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1047.663591] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-245fea7c-d68e-4204-aad8-575a6aec10a8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.671832] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b4ffa61-ada9-4174-aeb5-d8c5e9b7774e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.706136] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90123a9c-3ae8-4267-b01e-fc2ee5da7dbb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.713881] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f40f84c1-2dac-4c4b-8b97-ddab9a3c10e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1047.728400] env[68492]: DEBUG nova.compute.provider_tree [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1047.737076] env[68492]: DEBUG nova.scheduler.client.report [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1047.755183] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.464s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1047.755729] env[68492]: ERROR nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.755729] env[68492]: Faults: ['InvalidArgument'] [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Traceback (most recent call last): [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1047.755729] 
env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self.driver.spawn(context, instance, image_meta, [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self._fetch_image_if_missing(context, vi) [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] image_cache(vi, tmp_image_ds_loc) [ 1047.755729] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] vm_util.copy_virtual_disk( [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] session._wait_for_task(vmdk_copy_task) [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return self.wait_for_task(task_ref) [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return evt.wait() [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] result = hub.switch() [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] return self.greenlet.switch() [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1047.756075] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] self.f(*self.args, **self.kw) [ 1047.756385] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1047.756385] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] raise exceptions.translate_fault(task_info.error) [ 1047.756385] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.756385] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Faults: ['InvalidArgument'] [ 1047.756385] env[68492]: ERROR nova.compute.manager [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] [ 1047.756385] env[68492]: DEBUG nova.compute.utils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1047.759157] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Build of instance acbc1e36-0803-44ff-8ebc-094083193bc4 was re-scheduled: A specified parameter was not correct: fileType [ 1047.759157] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1047.759533] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1047.759701] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1047.759891] env[68492]: DEBUG nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1047.760011] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1048.066735] env[68492]: DEBUG nova.network.neutron [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.080562] env[68492]: INFO nova.compute.manager [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Took 0.32 seconds to deallocate network for instance. [ 1048.171034] env[68492]: INFO nova.scheduler.client.report [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleted allocations for instance acbc1e36-0803-44ff-8ebc-094083193bc4 [ 1048.195084] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b31324aa-6bb3-4b21-977f-2e80653e849c tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 468.409s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.196300] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 69.027s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1048.196489] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1048.196693] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1048.196873] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.198877] env[68492]: INFO nova.compute.manager [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Terminating instance [ 1048.202623] env[68492]: DEBUG nova.compute.manager [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1048.202816] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1048.203306] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0cf42789-6f0a-4b8c-a43a-a77cfecf73d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1048.212902] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c6312e-2c63-4681-8647-89ec4e527779 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1048.230634] env[68492]: DEBUG nova.compute.manager [None req-f995aecf-0818-40f8-8b8f-1c361b1202e2 tempest-ServerPasswordTestJSON-1753985612 tempest-ServerPasswordTestJSON-1753985612-project-member] [instance: f48567a8-6b74-46ee-af6b-37823323e17f] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.252340] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance acbc1e36-0803-44ff-8ebc-094083193bc4 could not be found. 
[ 1048.252549] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.252713] env[68492]: INFO nova.compute.manager [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1048.253124] env[68492]: DEBUG oslo.service.loopingcall [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1048.253238] env[68492]: DEBUG nova.compute.manager [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1048.253325] env[68492]: DEBUG nova.network.neutron [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1048.260051] env[68492]: DEBUG nova.compute.manager [None req-f995aecf-0818-40f8-8b8f-1c361b1202e2 tempest-ServerPasswordTestJSON-1753985612 tempest-ServerPasswordTestJSON-1753985612-project-member] [instance: f48567a8-6b74-46ee-af6b-37823323e17f] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.280785] env[68492]: DEBUG nova.network.neutron [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.289150] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f995aecf-0818-40f8-8b8f-1c361b1202e2 tempest-ServerPasswordTestJSON-1753985612 tempest-ServerPasswordTestJSON-1753985612-project-member] Lock "f48567a8-6b74-46ee-af6b-37823323e17f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.473s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.292706] env[68492]: INFO nova.compute.manager [-] [instance: acbc1e36-0803-44ff-8ebc-094083193bc4] Took 0.04 seconds to deallocate network for instance. [ 1048.302804] env[68492]: DEBUG nova.compute.manager [None req-0a00c346-1b9a-42d6-871e-9b332b1662bc tempest-ServerActionsTestOtherB-352976159 tempest-ServerActionsTestOtherB-352976159-project-member] [instance: a59a286e-ad8c-4628-b326-09762dea3534] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.324710] env[68492]: DEBUG nova.compute.manager [None req-0a00c346-1b9a-42d6-871e-9b332b1662bc tempest-ServerActionsTestOtherB-352976159 tempest-ServerActionsTestOtherB-352976159-project-member] [instance: a59a286e-ad8c-4628-b326-09762dea3534] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.358892] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0a00c346-1b9a-42d6-871e-9b332b1662bc tempest-ServerActionsTestOtherB-352976159 tempest-ServerActionsTestOtherB-352976159-project-member] Lock "a59a286e-ad8c-4628-b326-09762dea3534" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.450s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.366922] env[68492]: DEBUG nova.compute.manager [None req-6813a253-afce-4168-803d-2470c90de818 tempest-ServerActionsTestJSON-1562591659 tempest-ServerActionsTestJSON-1562591659-project-member] [instance: 2598cded-78b6-4230-98c5-7068b429a56c] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.399069] env[68492]: DEBUG nova.compute.manager [None req-6813a253-afce-4168-803d-2470c90de818 tempest-ServerActionsTestJSON-1562591659 tempest-ServerActionsTestJSON-1562591659-project-member] [instance: 2598cded-78b6-4230-98c5-7068b429a56c] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.410103] env[68492]: DEBUG oslo_concurrency.lockutils [None req-73ac527c-63f2-44f3-b4c4-74f0fc296104 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "acbc1e36-0803-44ff-8ebc-094083193bc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.214s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.421950] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6813a253-afce-4168-803d-2470c90de818 tempest-ServerActionsTestJSON-1562591659 tempest-ServerActionsTestJSON-1562591659-project-member] Lock "2598cded-78b6-4230-98c5-7068b429a56c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.629s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.429820] env[68492]: DEBUG nova.compute.manager [None req-0978fdad-7b06-4c3e-8104-0e06cce8ca05 tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: ab820eba-d4d5-4b07-bc68-79c4b8cf46c8] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.451410] env[68492]: DEBUG nova.compute.manager [None req-0978fdad-7b06-4c3e-8104-0e06cce8ca05 tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: ab820eba-d4d5-4b07-bc68-79c4b8cf46c8] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.470474] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0978fdad-7b06-4c3e-8104-0e06cce8ca05 tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "ab820eba-d4d5-4b07-bc68-79c4b8cf46c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.341s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.480027] env[68492]: DEBUG nova.compute.manager [None req-2858edaf-85d0-4282-8ac0-4604025c8ef5 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] [instance: 0de36474-6ab2-4c5c-a85c-5080d82b3f8e] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.502442] env[68492]: DEBUG nova.compute.manager [None req-2858edaf-85d0-4282-8ac0-4604025c8ef5 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] [instance: 0de36474-6ab2-4c5c-a85c-5080d82b3f8e] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.522922] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2858edaf-85d0-4282-8ac0-4604025c8ef5 tempest-ServersTestMultiNic-2090640626 tempest-ServersTestMultiNic-2090640626-project-member] Lock "0de36474-6ab2-4c5c-a85c-5080d82b3f8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.266s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.531175] env[68492]: DEBUG nova.compute.manager [None req-bab765e0-3ba8-4cc5-9ca4-2dca7a8387e2 tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] [instance: 49db2997-6ee3-4cbd-b640-77ad352ae2fd] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.552272] env[68492]: DEBUG nova.compute.manager [None req-bab765e0-3ba8-4cc5-9ca4-2dca7a8387e2 tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] [instance: 49db2997-6ee3-4cbd-b640-77ad352ae2fd] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.573038] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bab765e0-3ba8-4cc5-9ca4-2dca7a8387e2 tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Lock "49db2997-6ee3-4cbd-b640-77ad352ae2fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.868s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.581171] env[68492]: DEBUG nova.compute.manager [None req-4cd5f959-82eb-44fd-a937-2a168b111220 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: dacc9b15-d2d0-4d7e-b419-eff947683f42] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.606486] env[68492]: DEBUG nova.compute.manager [None req-4cd5f959-82eb-44fd-a937-2a168b111220 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: dacc9b15-d2d0-4d7e-b419-eff947683f42] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.629558] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4cd5f959-82eb-44fd-a937-2a168b111220 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "dacc9b15-d2d0-4d7e-b419-eff947683f42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.314s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.639707] env[68492]: DEBUG nova.compute.manager [None req-fb138079-a0e7-4e6a-bdf8-fade7e9e07ce tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] [instance: b8f3a42e-9412-408f-bbbc-2d7a542bd82e] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.665486] env[68492]: DEBUG nova.compute.manager [None req-fb138079-a0e7-4e6a-bdf8-fade7e9e07ce tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] [instance: b8f3a42e-9412-408f-bbbc-2d7a542bd82e] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.707234] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fb138079-a0e7-4e6a-bdf8-fade7e9e07ce tempest-ServerRescueNegativeTestJSON-913190447 tempest-ServerRescueNegativeTestJSON-913190447-project-member] Lock "b8f3a42e-9412-408f-bbbc-2d7a542bd82e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.706s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.716255] env[68492]: DEBUG nova.compute.manager [None req-277bafba-e318-4349-bec0-583423586f98 tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] [instance: fc27ef4a-0a1d-49c7-b96d-5a57810117bc] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.740021] env[68492]: DEBUG nova.compute.manager [None req-277bafba-e318-4349-bec0-583423586f98 tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] [instance: fc27ef4a-0a1d-49c7-b96d-5a57810117bc] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1048.760498] env[68492]: DEBUG oslo_concurrency.lockutils [None req-277bafba-e318-4349-bec0-583423586f98 tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Lock "fc27ef4a-0a1d-49c7-b96d-5a57810117bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.493s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1048.769579] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1048.827626] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1048.827879] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1048.829454] env[68492]: INFO nova.compute.claims [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1049.189021] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d26c9cc-91b1-48bb-b62d-aa054e9ddd02 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.195381] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceb9abb9-b9c2-4a50-a004-9e0eb4750672 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.225536] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8d16fa-0bd3-45c1-a530-c5e649274c6e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.232759] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a448c6f-2d61-4efa-bb00-faf02c30a8ab {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.245779] env[68492]: DEBUG nova.compute.provider_tree [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1049.255919] env[68492]: DEBUG nova.scheduler.client.report [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1049.271976] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.444s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1049.272548] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1049.309130] env[68492]: DEBUG nova.compute.utils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1049.310593] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1049.310779] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1049.320473] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Start building block device mappings for instance. 
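
The inventory dict logged above fully determines what the scheduler can place here: for each resource class, capacity is (total - reserved) * allocation_ratio, and max_unit caps what a single instance may claim (16 VCPU, 65530 MB, 102 GB here). A worked check against the logged values (plain Python, not Nova code):

    inventory = {  # exactly as reported in the log above
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 192.0        -- 48 vCPUs oversubscribed 4x
    # MEMORY_MB 196078.0
    # DISK_GB 400.0
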
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1049.360042] env[68492]: INFO nova.virt.block_device [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Booting with volume ea04d3be-c3f1-462d-a8b4-49bbf1089901 at /dev/sda [ 1049.384798] env[68492]: DEBUG nova.policy [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97091e8a008748ff9387fd56b8f4101d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '062703749a694504952ea24ee7eb40db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1049.422019] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-16d0a8d2-1e51-4057-ba23-e36046d7d4fc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.430510] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88935b72-1152-466a-b3b5-f90101c720b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.469360] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b1cf2d1d-a4c4-41ec-a678-98af7abde9b7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.478057] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bcbcdf8-3109-4aed-ae63-260bf1a0ecee {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.508032] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a0036a0-5e5b-44cf-ac20-095a6db64cc8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.515069] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31158a14-5785-463c-af61-ebf9e9570305 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.531034] env[68492]: DEBUG nova.virt.block_device [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating existing volume attachment record: 34cce07e-2795-4de7-9383-e94085e9c451 {{(pid=68492) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1049.785339] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Start spawning the instance on the hypervisor. 
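
This is a boot-from-volume build: the image is empty and the root device comes from the Cinder volume noted above ("Booting with volume ea04d3be-... at /dev/sda"), carried through the build as a block-device mapping. A sketch of the shape involved, trimmed to the keys used below (plain dicts standing in for Nova's BDM objects):

    bdm = {
        'boot_index': 0,                    # marks the root device
        'mount_device': '/dev/sda',
        'attachment_id': '34cce07e-2795-4de7-9383-e94085e9c451',
        'connection_info': {
            'driver_volume_type': 'vmdk',
            'data': {'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'},
        },
    }

    def root_volume(bdms):
        # The root disk is whichever mapping carries boot_index 0.
        return next(b for b in bdms if b.get('boot_index') == 0)

    root = root_volume([bdm])
    assert root['mount_device'] == '/dev/sda'
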
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1049.786105] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1049.786316] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1049.786381] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1049.786588] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1049.786777] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1049.787021] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1049.787366] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1049.787471] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1049.787670] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Got 1 
possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1049.787882] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1049.788102] env[68492]: DEBUG nova.virt.hardware [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1049.789253] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec9840c-8008-467d-893e-54810f1702b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.798175] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-634fdb0c-0663-45d3-ad27-cee0253c7462 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.056193] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Successfully created port: e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1050.736564] env[68492]: DEBUG nova.compute.manager [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Received event network-vif-plugged-e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1050.736818] env[68492]: DEBUG oslo_concurrency.lockutils [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] Acquiring lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1050.736965] env[68492]: DEBUG oslo_concurrency.lockutils [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1050.737232] env[68492]: DEBUG oslo_concurrency.lockutils [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1050.737414] env[68492]: DEBUG nova.compute.manager [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] No 
waiting events found dispatching network-vif-plugged-e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1050.737628] env[68492]: WARNING nova.compute.manager [req-37663c6a-99bc-4218-abd2-423bf5818571 req-ba821c1d-2e65-491d-939e-a55b16fe4382 service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Received unexpected event network-vif-plugged-e95e473d-6881-40c4-9e71-6c38a271c1ef for instance with vm_state building and task_state spawning. [ 1050.938855] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Successfully updated port: e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1050.949757] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1050.950721] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquired lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1050.950721] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1051.008583] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Instance cache missing network info. 
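
The WARNING just above is the external-event handshake firing early: Neutron posts network-vif-plugged-<port_id> to Nova, Nova pops a registered waiter for that (instance, event) pair if one exists, and otherwise logs the event as unexpected; here it is harmless because the port came up before the spawn started waiting on it. A minimal sketch of that pop-or-warn dispatch (a plain dict of threading.Event waiters, not Nova's InstanceEvents machinery):

    import threading

    waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(instance_uuid, event_name):
        # Called before starting the operation that triggers the event.
        ev = threading.Event()
        waiters[(instance_uuid, event_name)] = ev
        return ev          # the caller later blocks in ev.wait(timeout)

    def dispatch_event(instance_uuid, event_name):
        # Called when the external event arrives from Neutron.
        ev = waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            print('WARNING: received unexpected event %s for instance %s'
                  % (event_name, instance_uuid))
        else:
            ev.set()
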
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1051.316335] env[68492]: DEBUG nova.network.neutron [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating instance_info_cache with network_info: [{"id": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "address": "fa:16:3e:06:46:09", "network": {"id": "7812220d-1ad7-4373-b77f-8044664e6228", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1071081306-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "062703749a694504952ea24ee7eb40db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape95e473d-68", "ovs_interfaceid": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1051.337408] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Releasing lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1051.337720] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Instance network_info: |[{"id": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "address": "fa:16:3e:06:46:09", "network": {"id": "7812220d-1ad7-4373-b77f-8044664e6228", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1071081306-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "062703749a694504952ea24ee7eb40db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape95e473d-68", "ovs_interfaceid": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1051.338433] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:46:09', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7edb7c08-2fae-4df5-9ec6-5ccf06d7e337', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e95e473d-6881-40c4-9e71-6c38a271c1ef', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1051.350634] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Creating folder: Project (062703749a694504952ea24ee7eb40db). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1051.351918] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f5ca82e5-1763-4b5e-a34e-b6f49c4063c2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.366084] env[68492]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1051.366270] env[68492]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=68492) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1051.366884] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Folder already exists: Project (062703749a694504952ea24ee7eb40db). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1051.367713] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Creating folder: Instances. Parent ref: group-v677481. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1051.367713] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d3449afd-448c-4d77-96db-b74154b5023e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.376115] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Created folder: Instances in parent group-v677481. [ 1051.376371] env[68492]: DEBUG oslo.service.loopingcall [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
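
Note the DuplicateName exchange above: vSphere reports SOAP faults with HTTP status 200 (hence the suds WARNING), oslo.vmware re-raises them as VimFaultException with the fault names in fault_list, and the driver treats DuplicateName from CreateFolder as "already exists", making folder creation idempotent. Sketched (illustrative, not nova's vm_util):

    from oslo_vmware import exceptions as vexc

    def ensure_folder(session, parent_ref, name):
        try:
            return session.invoke_api(session.vim, 'CreateFolder',
                                      parent_ref, name=name)
        except vexc.VimFaultException as e:
            # e.fault_list carries the vSphere fault names, e.g.
            # "Fault list: [DuplicateName]" in the log above.
            if 'DuplicateName' in e.fault_list:
                return None   # folder already exists; nothing to do
            raise
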
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1051.376559] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1051.376759] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d809442e-bba0-414a-9061-578c00205406 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.395582] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1051.395582] env[68492]: value = "task-3395416" [ 1051.395582] env[68492]: _type = "Task" [ 1051.395582] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1051.405040] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395416, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1051.905379] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395416, 'name': CreateVM_Task, 'duration_secs': 0.289127} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1051.905772] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1051.906207] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'guest_format': None, 'attachment_id': '34cce07e-2795-4de7-9383-e94085e9c451', 'disk_bus': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677484', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'name': 'volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e3ea0b7a-bc22-4285-bcdd-560c509c09e9', 'attached_at': '', 'detached_at': '', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'serial': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'}, 'device_type': None, 'boot_index': 0, 'delete_on_termination': True, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=68492) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1051.906432] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Root volume attach. 
Driver type: vmdk {{(pid=68492) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1051.907217] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e34811f-705e-4381-aaa7-c4072b9d7739 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.916067] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28ca8d8e-eafa-403c-96e0-cf87d9b224de {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.921951] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5625aad4-6903-493b-be31-db1b18d57edf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.928157] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-3e618b13-41d3-4cbe-bb37-7035b7d61990 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.935856] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1051.935856] env[68492]: value = "task-3395417" [ 1051.935856] env[68492]: _type = "Task" [ 1051.935856] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1051.943386] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1052.446114] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 42%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1052.946535] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 54%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1053.226218] env[68492]: DEBUG nova.compute.manager [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Received event network-changed-e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1053.226218] env[68492]: DEBUG nova.compute.manager [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Refreshing instance network info cache due to event network-changed-e95e473d-6881-40c4-9e71-6c38a271c1ef. 
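
Every *_Task call in this log follows the same contract: the SOAP method returns a task reference immediately, and the caller polls the task until it finishes; the recurring "RelocateVM_Task progress is N%" lines are those polls. A usage sketch against oslo.vmware's session API (vm_ref and relocate_spec assumed already built):

    def relocate_and_wait(session, vm_ref, relocate_spec):
        # session: an established oslo_vmware.api.VMwareAPISession.
        task = session.invoke_api(session.vim, 'RelocateVM_Task',
                                  vm_ref, spec=relocate_spec)
        # wait_for_task polls the task's info on a fixed interval and
        # returns the task info on success or raises on error.
        return session.wait_for_task(task)
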
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1053.229247] env[68492]: DEBUG oslo_concurrency.lockutils [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] Acquiring lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1053.229247] env[68492]: DEBUG oslo_concurrency.lockutils [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] Acquired lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1053.229247] env[68492]: DEBUG nova.network.neutron [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Refreshing network info cache for port e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1053.447761] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 69%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1053.523123] env[68492]: DEBUG nova.network.neutron [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updated VIF entry in instance network info cache for port e95e473d-6881-40c4-9e71-6c38a271c1ef. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1053.523501] env[68492]: DEBUG nova.network.neutron [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating instance_info_cache with network_info: [{"id": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "address": "fa:16:3e:06:46:09", "network": {"id": "7812220d-1ad7-4373-b77f-8044664e6228", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1071081306-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "062703749a694504952ea24ee7eb40db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape95e473d-68", "ovs_interfaceid": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1053.535404] env[68492]: DEBUG oslo_concurrency.lockutils [req-42932d03-26a6-4bf0-ab3f-7f136c769ad9 req-03c9811a-b42b-40b4-8985-e5f074dab1bc service nova] Releasing lock 
"refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1053.949722] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 84%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1054.449196] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 97%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1054.949279] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 98%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.231320] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1055.449804] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 98%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.811550] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "29397c54-4bb2-4b43-afcb-9969d8dec996" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1055.811754] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1055.951603] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task} progress is 98%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1056.231447] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.231721] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1056.231760] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1056.255223] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255223] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255326] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255383] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255513] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255634] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255750] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255869] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. 
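
_heal_instance_info_cache runs as a periodic task and refreshes the network cache of at most one instance per pass, skipping anything still in the Building state, since the in-flight build owns that cache. The filter amounts to (illustrative sketch):

    BUILDING = 'building'

    def instances_to_heal(instances):
        for inst in instances:
            if inst['vm_state'] == BUILDING:
                # "Skipping network cache update ... because it is Building."
                continue
            yield inst

    # Every instance in this pass is Building, hence the final
    # "Didn't find any instances for network info cache update."
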
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.255984] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.256131] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1056.256256] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1056.373980] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.374274] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1056.450468] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395417, 'name': RelocateVM_Task, 'duration_secs': 4.097956} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1056.450758] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Volume attach. 
Driver type: vmdk {{(pid=68492) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1056.450951] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677484', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'name': 'volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e3ea0b7a-bc22-4285-bcdd-560c509c09e9', 'attached_at': '', 'detached_at': '', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'serial': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'} {{(pid=68492) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1056.451731] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d721ea5-7406-4f85-adda-c8deeadebbef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.468555] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f538cd4b-025a-4b6f-a983-17b59b5961f1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.490678] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Reconfiguring VM instance instance-00000033 to attach disk [datastore2] volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901/volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901.vmdk or device None with type thin {{(pid=68492) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1056.490943] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-3f5d0351-4d7b-447b-9216-a78ccdf4cd13 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.511044] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1056.511044] env[68492]: value = "task-3395418" [ 1056.511044] env[68492]: _type = "Task" [ 1056.511044] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1056.519441] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395418, 'name': ReconfigVM_Task} progress is 6%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1057.020584] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395418, 'name': ReconfigVM_Task, 'duration_secs': 0.32328} completed successfully. 
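
Attaching the volume's vmdk is an ordinary VM reconfigure: build a VirtualMachineConfigSpec whose deviceChange adds a VirtualDisk backed by the volume file, then run ReconfigVM_Task. A simplified sketch of the spec construction via the suds client factory (controller key and unit number assumed known; capacity and other fields omitted):

    def attach_disk_spec(client_factory, vmdk_path, controller_key, unit_number):
        # client_factory is session.vim.client.factory.
        backing = client_factory.create('ns0:VirtualDiskFlatVer2BackingInfo')
        backing.diskMode = 'persistent'
        backing.thinProvisioned = True    # "with type thin" in the log
        backing.fileName = vmdk_path      # e.g. '[datastore2] volume-.../....vmdk'

        disk = client_factory.create('ns0:VirtualDisk')
        disk.backing = backing
        disk.controllerKey = controller_key
        disk.unitNumber = unit_number

        disk_change = client_factory.create('ns0:VirtualDeviceConfigSpec')
        disk_change.operation = 'add'
        disk_change.device = disk

        spec = client_factory.create('ns0:VirtualMachineConfigSpec')
        spec.deviceChange = [disk_change]
        return spec

The second, shorter ReconfigVM_Task that follows typically records volume bookkeeping metadata on the VM rather than changing its hardware.
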
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1057.020858] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Reconfigured VM instance instance-00000033 to attach disk [datastore2] volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901/volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901.vmdk or device None with type thin {{(pid=68492) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1057.025610] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-a21ea98a-8b85-4304-bfa7-441fa40d5f11 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.039732] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1057.039732] env[68492]: value = "task-3395419" [ 1057.039732] env[68492]: _type = "Task" [ 1057.039732] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.047065] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395419, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1057.230392] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1057.550020] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395419, 'name': ReconfigVM_Task, 'duration_secs': 0.154471} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1057.550390] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677484', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'name': 'volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e3ea0b7a-bc22-4285-bcdd-560c509c09e9', 'attached_at': '', 'detached_at': '', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'serial': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'} {{(pid=68492) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1057.551169] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-d450845d-227f-4d2c-955a-67839cd055aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.557870] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1057.557870] env[68492]: value = "task-3395420" [ 1057.557870] env[68492]: _type = "Task" [ 1057.557870] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.566472] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395420, 'name': Rename_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1058.067382] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395420, 'name': Rename_Task, 'duration_secs': 0.12674} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1058.067637] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Powering on the VM {{(pid=68492) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1058.067874] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-df52eebc-3019-430a-8064-40791d0adcde {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.073405] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1058.073405] env[68492]: value = "task-3395421" [ 1058.073405] env[68492]: _type = "Task" [ 1058.073405] env[68492]: } to complete. 
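
Spawn finishes with two more short tasks: a Rename_Task that gives the VM its final name and the PowerOnVM_Task below. Both are plain invoke-and-wait calls (same assumed session and vm_ref as in the earlier sketches):

    def rename_and_power_on(session, vm_ref, new_name):
        # ManagedEntity.Rename_Task takes only the new name;
        # PowerOnVM_Task needs no arguments beyond the VM itself.
        session.wait_for_task(
            session.invoke_api(session.vim, 'Rename_Task', vm_ref,
                               newName=new_name))
        session.wait_for_task(
            session.invoke_api(session.vim, 'PowerOnVM_Task', vm_ref))
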
[ 1058.080628] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395421, 'name': PowerOnVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1058.231460] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1058.242966] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1058.243279] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1058.244433] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1058.244433] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1058.244888] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f435568-4b82-4f38-b164-c6fde86d6ea7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.253714] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c177f36-c270-402c-8e4c-899234d95dbd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.269633] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564e3a03-e608-4f5c-83e9-cbfec331a8cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.276610] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d396ca0-07f0-43e6-8fde-69ed4d6ccda0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.305706] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180979MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1058.305848] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1058.306066] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1058.381511] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 14af3749-f031-4543-96e4-af0b4fd28e2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.381703] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.381862] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382015] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382179] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382309] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382448] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382594] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382740] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.382882] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1058.394677] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.409699] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.421233] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.434994] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.450200] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5c5946ea-9bda-4c9c-80cb-e8a580b74148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.461404] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 37f2e678-b217-4bf3-83e6-74d85ee8a446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.472779] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 888dac8e-013f-4024-9fa7-4cc13c361268 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.487662] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.502383] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a6bf3888-5c1a-4a12-85a9-221cbba6457b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.514673] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2785a54b-6fd5-413d-bdd1-ead082d8777b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.526693] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2d422f7c-9295-4b08-a623-ae07bacb3e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.539256] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 61d932c3-4c41-4648-b5ee-c083ed425e1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.553506] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c9618d2a-72ce-4395-b739-2585861bc446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.563911] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9bffaa25-3195-4077-a978-6b0dcc4b8ecd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.578466] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.585308] env[68492]: DEBUG oslo_vmware.api [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395421, 'name': PowerOnVM_Task, 'duration_secs': 0.455097} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1058.586065] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Powered on the VM {{(pid=68492) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}}
[ 1058.586328] env[68492]: INFO nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Took 8.80 seconds to spawn the instance on the hypervisor.
[ 1058.586587] env[68492]: DEBUG nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Checking state {{(pid=68492) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
[ 1058.587572] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09e0f438-0ad0-48b8-848e-fde9064780da {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.590903] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1058.591149] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1058.591298] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1058.671340] env[68492]: INFO nova.compute.manager [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Took 9.86 seconds to build instance.
[ 1058.687969] env[68492]: DEBUG oslo_concurrency.lockutils [None req-460c5db0-7115-448f-9bd2-6a6d80c9c491 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 183.982s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1058.699483] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1058.750080] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1058.900065] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8897efea-b717-4b47-879d-e340b625a500 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.907644] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f24b4bc-d200-4dbd-b3e2-129f047d7ed6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.936490] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a4365b-1a31-4f87-9481-bcb3670bee92 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.943195] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9917c569-e151-40a7-b97c-ca674ced5295 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1058.956368] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1058.964103] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1058.978030] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1058.978030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.671s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1058.978030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.228s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1058.979909] env[68492]: INFO nova.compute.claims [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1059.280769] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b8c904b-922c-4c2c-8ce6-39400a6a0d9c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.288254] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd31046b-076b-486f-8bad-cd84c9e133aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.326518] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a29c7dfa-d537-4c2b-828a-6c68b7ceac11 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.333956] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f6c2e98-efdd-4ae1-af42-e25cdf3ff62c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.349157] env[68492]: DEBUG nova.compute.provider_tree [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1059.362318] env[68492]: DEBUG nova.scheduler.client.report [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1059.378445] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.401s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1059.378925] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1059.411793] env[68492]: DEBUG nova.compute.utils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1059.414551] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1059.414724] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1059.424452] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1059.494101] env[68492]: DEBUG nova.policy [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae12ba5644da4cf99138f90612514431', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df5cff4632c44e188abd1b60a3eecedd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1059.497583] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1059.525414] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T15:01:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fdaaef5b-9353-4b24-a292-30d3bcb4448a',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1932089972',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1059.525650] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1059.525806] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1059.525989] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1059.526153] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1059.526375] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1059.526607] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1059.526761] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1059.526939] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1059.527402] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1059.527402] env[68492]: DEBUG nova.virt.hardware [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1059.528205] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6736dbd8-92e2-44e0-99ab-fea486fd8d6a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.536694] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92e6e16-f03d-44a7-9f53-39db798cbd23 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1059.820394] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Successfully created port: e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1059.976403] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1059.976694] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1060.231134] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1060.415758] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Successfully updated port: e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1060.427197] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1060.427394] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1060.427491] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 1060.471603] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 1060.530208] env[68492]: DEBUG nova.compute.manager [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Received event network-vif-plugged-e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1060.530484] env[68492]: DEBUG oslo_concurrency.lockutils [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] Acquiring lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1060.530704] env[68492]: DEBUG oslo_concurrency.lockutils [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1060.530871] env[68492]: DEBUG oslo_concurrency.lockutils [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1060.531380] env[68492]: DEBUG nova.compute.manager [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] No waiting events found dispatching network-vif-plugged-e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1060.531615] env[68492]: WARNING nova.compute.manager [req-fe8fc6f3-80ad-42a7-84da-eb21ff50e6b0 req-52ef6463-c1ec-4149-896b-435fc0b1eb93 service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Received unexpected event network-vif-plugged-e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 for instance with vm_state building and task_state spawning.
[ 1060.646023] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Updating instance_info_cache with network_info: [{"id": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "address": "fa:16:3e:d4:9e:21", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8d2ff7c-d2", "ovs_interfaceid": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1060.666186] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1060.666518] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance network_info: |[{"id": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "address": "fa:16:3e:d4:9e:21", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8d2ff7c-d2", "ovs_interfaceid": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1060.668481] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d4:9e:21', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1060.676111] env[68492]: DEBUG oslo.service.loopingcall [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1060.676645] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1060.676885] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5b44e496-5c86-4acd-9588-6fe3e52f5bf1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1060.698220] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1060.698220] env[68492]: value = "task-3395422"
[ 1060.698220] env[68492]: _type = "Task"
[ 1060.698220] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1060.706236] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395422, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1060.828864] env[68492]: DEBUG nova.compute.manager [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Received event network-changed-e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1060.830162] env[68492]: DEBUG nova.compute.manager [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Refreshing instance network info cache due to event network-changed-e95e473d-6881-40c4-9e71-6c38a271c1ef. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1060.830162] env[68492]: DEBUG oslo_concurrency.lockutils [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] Acquiring lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1060.830162] env[68492]: DEBUG oslo_concurrency.lockutils [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] Acquired lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1060.831727] env[68492]: DEBUG nova.network.neutron [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Refreshing network info cache for port e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 1061.124344] env[68492]: DEBUG nova.network.neutron [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updated VIF entry in instance network info cache for port e95e473d-6881-40c4-9e71-6c38a271c1ef. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 1061.124344] env[68492]: DEBUG nova.network.neutron [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating instance_info_cache with network_info: [{"id": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "address": "fa:16:3e:06:46:09", "network": {"id": "7812220d-1ad7-4373-b77f-8044664e6228", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1071081306-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "062703749a694504952ea24ee7eb40db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape95e473d-68", "ovs_interfaceid": "e95e473d-6881-40c4-9e71-6c38a271c1ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1061.137271] env[68492]: DEBUG oslo_concurrency.lockutils [req-08bda904-d906-4aee-855e-dc2cd5b6a374 req-9f9bc1c9-74ea-4cf1-874c-b7c77caf7a8d service nova] Releasing lock "refresh_cache-e3ea0b7a-bc22-4285-bcdd-560c509c09e9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1061.209410] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395422, 'name': CreateVM_Task, 'duration_secs': 0.301759} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1061.209410] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1061.209872] env[68492]: DEBUG oslo_vmware.service [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4f58831-0c8e-4aff-a0a7-9019efea9e90 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1061.215773] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1061.215983] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1061.216390] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1061.216671] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-47462426-e69e-41b6-a4d8-29e72b82685f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1061.222501] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){
[ 1061.222501] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52efcc1b-ccd1-aa93-471d-49ed0137a675"
[ 1061.222501] env[68492]: _type = "Task"
[ 1061.222501] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1061.230577] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52efcc1b-ccd1-aa93-471d-49ed0137a675, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1061.230577] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1061.230577] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}}
[ 1061.733238] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1061.733545] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1061.733713] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1061.733859] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquired lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1061.734046] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1061.734310] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0ae751a-3f7a-4e70-817f-4305a911ccfb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1061.756964] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1061.757166] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1061.757943] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-548a26b0-4a09-4055-a6ec-8ea54e1fef0c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1061.764016] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-816d998c-ebc6-463a-91e6-b5d7fcdfa1fc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1061.769190] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){
[ 1061.769190] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d62faf-397f-4ac6-40c8-a9fd6d2920d6"
[ 1061.769190] env[68492]: _type = "Task"
[ 1061.769190] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1061.776319] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d62faf-397f-4ac6-40c8-a9fd6d2920d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1062.231235] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1062.279645] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1062.279908] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating directory with path [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1062.280159] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d154ba6c-0d06-4c26-8c79-9bbb58aa98d8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.337891] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Created directory with path [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1062.338117] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Fetch image to [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1062.338293] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore1 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1062.339109] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e0da320-81dc-488e-bd5a-8dad1cad994b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.346379] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83a46f4e-4491-4234-85f2-93c88f76e827 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.355832] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8b23008-a2d1-469d-ada4-dcc4589524c1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.386084] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb0370b-1754-4d09-a092-25cf1a152ec7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.391937] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5ebfd344-f331-4f42-8659-01aff6913273 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1062.411614] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore1 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1062.462910] env[68492]: DEBUG oslo_vmware.rw_handles [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1062.520805] env[68492]: DEBUG oslo_vmware.rw_handles [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1062.521010] env[68492]: DEBUG oslo_vmware.rw_handles [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1062.586989] env[68492]: DEBUG nova.compute.manager [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Received event network-changed-e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1062.587272] env[68492]: DEBUG nova.compute.manager [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Refreshing instance network info cache due to event network-changed-e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1062.587531] env[68492]: DEBUG oslo_concurrency.lockutils [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] Acquiring lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1062.587727] env[68492]: DEBUG oslo_concurrency.lockutils [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] Acquired lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1062.587940] env[68492]: DEBUG nova.network.neutron [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Refreshing network info cache for port e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 1063.016708] env[68492]: DEBUG nova.network.neutron [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Updated VIF entry in instance network info cache for port e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7.
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1063.017075] env[68492]: DEBUG nova.network.neutron [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Updating instance_info_cache with network_info: [{"id": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "address": "fa:16:3e:d4:9e:21", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape8d2ff7c-d2", "ovs_interfaceid": "e8d2ff7c-d29d-4e1e-9d44-fa0627d077f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1063.027293] env[68492]: DEBUG oslo_concurrency.lockutils [req-d92db67e-5aec-4ee8-9b72-5a5b6f884c14 req-78510cac-fb87-4328-928b-36f521c8785b service nova] Releasing lock "refresh_cache-00387f6d-880b-4a0b-a4be-afb1fe4c844b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1078.468531] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.468531] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.468827] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.468920] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.469045] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1078.470927] env[68492]: INFO nova.compute.manager [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Terminating instance [ 1078.472791] env[68492]: DEBUG nova.compute.manager [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1078.473061] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Powering off the VM {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1078.473441] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-9c4ee9b2-4140-4494-b809-72b58aea4173 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.480990] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1078.480990] env[68492]: value = "task-3395423" [ 1078.480990] env[68492]: _type = "Task" [ 1078.480990] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1078.489077] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395423, 'name': PowerOffVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1078.991844] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395423, 'name': PowerOffVM_Task, 'duration_secs': 0.181653} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1078.992137] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Powered off the VM {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1078.992382] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Volume detach. Driver type: vmdk {{(pid=68492) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1078.992583] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677484', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'name': 'volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e3ea0b7a-bc22-4285-bcdd-560c509c09e9', 'attached_at': '', 'detached_at': '', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'serial': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'} {{(pid=68492) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1078.993362] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c5e34ca-3360-4f4b-9641-04afd95d6899 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.011393] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef50c610-acd3-4668-9a32-62949fed0f09 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.017632] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a7928e-379b-4248-afe8-e6af8a557286 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.034846] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-309eece1-7a63-4382-901d-b2d512bb1703 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.049968] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] The volume has not been displaced from its original location: [datastore2] volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901/volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901.vmdk. No consolidation needed. 
{{(pid=68492) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1079.056034] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Reconfiguring VM instance instance-00000033 to detach disk 2000 {{(pid=68492) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1079.056034] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-a8ef9ae9-bab8-4548-92fb-5877d4e844e0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.071985] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1079.071985] env[68492]: value = "task-3395424" [ 1079.071985] env[68492]: _type = "Task" [ 1079.071985] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.079420] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395424, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.582674] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395424, 'name': ReconfigVM_Task, 'duration_secs': 0.146157} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1079.582956] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Reconfigured VM instance instance-00000033 to detach disk 2000 {{(pid=68492) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1079.588265] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-517e6423-5b3f-4323-831c-bf5bde932007 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.602671] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1079.602671] env[68492]: value = "task-3395425" [ 1079.602671] env[68492]: _type = "Task" [ 1079.602671] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.610721] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395425, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1080.112119] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395425, 'name': ReconfigVM_Task, 'duration_secs': 0.118636} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1080.112458] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677484', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'name': 'volume-ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e3ea0b7a-bc22-4285-bcdd-560c509c09e9', 'attached_at': '', 'detached_at': '', 'volume_id': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901', 'serial': 'ea04d3be-c3f1-462d-a8b4-49bbf1089901'} {{(pid=68492) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1080.112768] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1080.113540] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c09309b-0e38-4750-9d4c-e8b7ec728b6d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.119864] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1080.120093] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d1284901-e882-499c-9c8a-27250b722dbf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.173795] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1080.174027] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1080.174218] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 
tempest-ServersTestBootFromVolume-149343409-project-member] Deleting the datastore file [datastore2] e3ea0b7a-bc22-4285-bcdd-560c509c09e9 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1080.174504] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0a24aa3b-3c8b-490a-a634-8bc6d2c5a1ed {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.180443] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for the task: (returnval){ [ 1080.180443] env[68492]: value = "task-3395427" [ 1080.180443] env[68492]: _type = "Task" [ 1080.180443] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1080.188210] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395427, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1080.690549] env[68492]: DEBUG oslo_vmware.api [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Task: {'id': task-3395427, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071972} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1080.690833] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1080.691029] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1080.691212] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1080.691382] env[68492]: INFO nova.compute.manager [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Took 2.22 seconds to destroy the instance on the hypervisor. [ 1080.691622] env[68492]: DEBUG oslo.service.loopingcall [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1080.691805] env[68492]: DEBUG nova.compute.manager [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1080.691902] env[68492]: DEBUG nova.network.neutron [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1081.272195] env[68492]: DEBUG nova.network.neutron [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.285680] env[68492]: DEBUG nova.compute.manager [req-683df05c-be29-4a19-8149-c6d7e0096e7f req-4b1578c3-af16-457f-9e95-662bb89404ac service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Received event network-vif-deleted-e95e473d-6881-40c4-9e71-6c38a271c1ef {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1081.285888] env[68492]: INFO nova.compute.manager [req-683df05c-be29-4a19-8149-c6d7e0096e7f req-4b1578c3-af16-457f-9e95-662bb89404ac service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Neutron deleted interface e95e473d-6881-40c4-9e71-6c38a271c1ef; detaching it from the instance and deleting it from the info cache [ 1081.286150] env[68492]: DEBUG nova.network.neutron [req-683df05c-be29-4a19-8149-c6d7e0096e7f req-4b1578c3-af16-457f-9e95-662bb89404ac service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.292120] env[68492]: INFO nova.compute.manager [-] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Took 0.60 seconds to deallocate network for instance. [ 1081.298579] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-523a2a18-6aaf-4a44-87c8-b97450eb516f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.310259] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7a31d39-4d7d-4343-bc3f-dad78fd20f5c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.354171] env[68492]: DEBUG nova.compute.manager [req-683df05c-be29-4a19-8149-c6d7e0096e7f req-4b1578c3-af16-457f-9e95-662bb89404ac service nova] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Detach interface failed, port_id=e95e473d-6881-40c4-9e71-6c38a271c1ef, reason: Instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 could not be found. {{(pid=68492) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10941}} [ 1081.384630] env[68492]: INFO nova.compute.manager [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Took 0.09 seconds to detach 1 volumes for instance. 
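Note: the paired "Acquiring lock" / "acquired ... waited" / '"released" ... held' records above (lockutils.py:404, 409 and 423) are emitted by the oslo.concurrency synchronized wrapper, which times how long a caller waited for a named semaphore and how long it held it. A minimal sketch of that pattern, not Nova's actual code; the function name here is hypothetical:

from oslo_concurrency import lockutils

# Hypothetical critical section; Nova wraps methods such as
# ResourceTracker.update_usage in the same way.
@lockutils.synchronized('compute_resources')
def update_usage_example():
    # The decorator's inner() wrapper logs "Acquiring lock" before
    # blocking, "acquired ... waited <t>s" once the semaphore is held,
    # and '"released" ... held <t>s' when this function returns.
    pass

update_usage_example()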
[ 1081.385641] env[68492]: DEBUG nova.compute.manager [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Deleting volume: ea04d3be-c3f1-462d-a8b4-49bbf1089901 {{(pid=68492) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3222}} [ 1081.484074] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.484380] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.484704] env[68492]: DEBUG nova.objects.instance [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lazy-loading 'resources' on Instance uuid e3ea0b7a-bc22-4285-bcdd-560c509c09e9 {{(pid=68492) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1081.870940] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-274bc693-979d-4844-873f-6cbd95fa5d78 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.882059] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-080bf5e5-070f-4dd3-b1e6-3117036613eb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.919201] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20692242-eafb-4dfd-9cb0-b2c0addf3950 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.927348] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a461cb-9c07-4cab-8463-12df5a9a24d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.941487] env[68492]: DEBUG nova.compute.provider_tree [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1081.951373] env[68492]: DEBUG nova.scheduler.client.report [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1081.970690] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.486s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.989744] env[68492]: INFO nova.scheduler.client.report [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Deleted allocations for instance e3ea0b7a-bc22-4285-bcdd-560c509c09e9 [ 1082.054885] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8a7bf646-cc33-427e-9aed-febb383ecb86 tempest-ServersTestBootFromVolume-149343409 tempest-ServersTestBootFromVolume-149343409-project-member] Lock "e3ea0b7a-bc22-4285-bcdd-560c509c09e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.586s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1090.715579] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b53d8afa-057b-451a-8c4c-00ada6c4cc0a tempest-ServersTestJSON-1214267113 tempest-ServersTestJSON-1214267113-project-member] Acquiring lock "49885647-f6a0-468a-bf58-206de779c896" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1090.715912] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b53d8afa-057b-451a-8c4c-00ada6c4cc0a tempest-ServersTestJSON-1214267113 tempest-ServersTestJSON-1214267113-project-member] Lock "49885647-f6a0-468a-bf58-206de779c896" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1096.174200] env[68492]: WARNING oslo_vmware.rw_handles [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles File
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.174200] env[68492]: ERROR oslo_vmware.rw_handles [ 1096.174834] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1096.177076] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1096.177386] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Copying Virtual Disk [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/a8eab2cc-7154-44be-b269-3e6ed5212b4b/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1096.177773] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cb2978d9-b845-40d4-a96c-dd2ab8d5d9a8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.186441] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){ [ 1096.186441] env[68492]: value = "task-3395429" [ 1096.186441] env[68492]: _type = "Task" [ 1096.186441] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1096.194660] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395429, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1096.566198] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.696853] env[68492]: DEBUG oslo_vmware.exceptions [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1096.697164] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1096.697734] env[68492]: ERROR nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1096.697734] env[68492]: Faults: ['InvalidArgument'] [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Traceback (most recent call last): [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] yield resources [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self.driver.spawn(context, instance, image_meta, [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self._fetch_image_if_missing(context, vi) [ 1096.697734] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] image_cache(vi, tmp_image_ds_loc) [
1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] vm_util.copy_virtual_disk( [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] session._wait_for_task(vmdk_copy_task) [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return self.wait_for_task(task_ref) [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return evt.wait() [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] result = hub.switch() [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1096.698124] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return self.greenlet.switch() [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self.f(*self.args, **self.kw) [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] raise exceptions.translate_fault(task_info.error) [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Faults: ['InvalidArgument'] [ 1096.698524] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] [ 1096.698524] env[68492]: INFO nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Terminating instance [ 1096.699546] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 
tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1096.699764] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1096.700007] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2723c246-52f4-42c6-8a7f-72fbd7d96573 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.702397] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1096.702623] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1096.703427] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99af38d4-08f4-47fd-a266-25a1ec589037 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.710005] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1096.710240] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-06614269-b86f-4b9f-9de9-74872fbc60a3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.712427] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1096.712600] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1096.713607] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-976e5ff9-b80f-49c2-9ade-136f2cbe814b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1096.718442] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for the task: (returnval){
[ 1096.718442] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527490cf-4465-55a8-c43e-384f976f8743"
[ 1096.718442] env[68492]: _type = "Task"
[ 1096.718442] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1096.725807] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527490cf-4465-55a8-c43e-384f976f8743, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1096.772855] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1096.773090] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1096.773276] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleting the datastore file [datastore2] 14af3749-f031-4543-96e4-af0b4fd28e2b {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1096.773537] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6c492319-0f2f-4d1b-b6cf-d108071a15d6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1096.779728] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for the task: (returnval){
[ 1096.779728] env[68492]: value = "task-3395431"
[ 1096.779728] env[68492]: _type = "Task"
[ 1096.779728] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1096.786802] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395431, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1097.229010] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1097.229309] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Creating directory with path [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1097.229551] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01964514-5834-4c3c-87d6-e3fdd9cbd8e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.240692] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Created directory with path [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1097.240883] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Fetch image to [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1097.241066] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1097.241798] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a61e747-9bba-4b61-942a-aca7c536d837 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.248600] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26ee5da-2fe9-46fa-9b1c-baf183b291bb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.259032] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5447b808-854f-41c3-bb19-c3a5180ebcd5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.293277] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fae5b37-cc85-44f6-8672-98afa77fb574 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.300882] env[68492]: DEBUG oslo_vmware.api [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Task: {'id': task-3395431, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07349} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1097.301813] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1097.302017] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1097.302201] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1097.302406] env[68492]: INFO nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1097.304175] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4b20e07b-b51e-46a7-bd9d-e38d3228e799 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.306120] env[68492]: DEBUG nova.compute.claims [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1097.306280] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1097.306491] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1097.331456] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1097.398823] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1097.464233] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1097.464441] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1097.709860] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d12ac91a-a9e1-4cd6-a157-5b6d07767af0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.717375] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-173b890f-56f9-4fcf-90d8-e4db6a1a813e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.747437] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a9aa580-f721-4b08-b473-0560427604ee {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.754375] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5eb11bc-7481-4a75-b34f-ca17bd9fc126 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1097.767316] env[68492]: DEBUG nova.compute.provider_tree [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1097.775752] env[68492]: DEBUG nova.scheduler.client.report [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1097.789908] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.483s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1097.790452] env[68492]: ERROR nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1097.790452] env[68492]: Faults: ['InvalidArgument']
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Traceback (most recent call last):
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self.driver.spawn(context, instance, image_meta,
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self._fetch_image_if_missing(context, vi)
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] image_cache(vi, tmp_image_ds_loc)
[ 1097.790452] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] vm_util.copy_virtual_disk(
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] session._wait_for_task(vmdk_copy_task)
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return self.wait_for_task(task_ref)
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return evt.wait()
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] result = hub.switch()
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] return self.greenlet.switch()
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1097.790836] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] self.f(*self.args, **self.kw)
[ 1097.791201] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1097.791201] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] raise exceptions.translate_fault(task_info.error)
[ 1097.791201] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1097.791201] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Faults: ['InvalidArgument']
[ 1097.791201] env[68492]: ERROR nova.compute.manager [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b]
[ 1097.791201] env[68492]: DEBUG nova.compute.utils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1097.792766] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Build of instance 14af3749-f031-4543-96e4-af0b4fd28e2b was re-scheduled: A specified parameter was not correct: fileType
[ 1097.792766] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1097.793167] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1097.793354] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1097.793512] env[68492]: DEBUG nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1097.793673] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1098.293148] env[68492]: DEBUG nova.network.neutron [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1098.306562] env[68492]: INFO nova.compute.manager [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Took 0.51 seconds to deallocate network for instance.
[ 1098.411199] env[68492]: INFO nova.scheduler.client.report [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Deleted allocations for instance 14af3749-f031-4543-96e4-af0b4fd28e2b
[ 1098.433555] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc4836d2-b343-42f9-864f-452a0b3e909b tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 514.780s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1098.434794] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 315.777s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1098.434981] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Acquiring lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1098.435210] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1098.436115] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1098.437974] env[68492]: INFO nova.compute.manager [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Terminating instance
[ 1098.440043] env[68492]: DEBUG nova.compute.manager [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1098.440043] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1098.440337] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-38a4c3ad-d26c-41c8-bd21-eac65b5e6fdf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.451619] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d60b654d-f12d-4931-a85e-73dade86f420 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.463845] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1098.485828] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 14af3749-f031-4543-96e4-af0b4fd28e2b could not be found.
[ 1098.486095] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1098.486326] env[68492]: INFO nova.compute.manager [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1098.486617] env[68492]: DEBUG oslo.service.loopingcall [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1098.486871] env[68492]: DEBUG nova.compute.manager [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1098.486993] env[68492]: DEBUG nova.network.neutron [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1098.516147] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1098.516426] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1098.518273] env[68492]: INFO nova.compute.claims [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1098.524187] env[68492]: DEBUG nova.network.neutron [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1098.540521] env[68492]: INFO nova.compute.manager [-] [instance: 14af3749-f031-4543-96e4-af0b4fd28e2b] Took 0.05 seconds to deallocate network for instance.
[ 1098.649807] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b2e6d74c-9b9a-4cda-9a6c-1de062ccd654 tempest-ListImageFiltersTestJSON-2023066398 tempest-ListImageFiltersTestJSON-2023066398-project-member] Lock "14af3749-f031-4543-96e4-af0b4fd28e2b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.215s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1098.883599] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5177b759-f899-4a95-a30c-a4c936348e71 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.891896] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21372e27-976d-4140-9bc6-856b03a92953 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.923237] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fb48db3-54c7-4ad9-93b9-117aa9df283f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.930472] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6406267-c8a3-4ece-8f02-58bc20068fc7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1098.943674] env[68492]: DEBUG nova.compute.provider_tree [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1098.952999] env[68492]: DEBUG nova.scheduler.client.report [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1098.966756] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.450s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1098.967295] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1099.001210] env[68492]: DEBUG nova.compute.utils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1099.003275] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1099.003459] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1099.010448] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1099.066159] env[68492]: DEBUG nova.policy [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '39025730d24844748c57b6e45ae54af0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '84fa3c41af4c42cbb652c4ccb0f3785e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1099.075782] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1099.104223] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1099.104854] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1099.105102] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1099.105341] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1099.106072] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1099.106072] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1099.106315] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1099.106510] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1099.107033] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1099.107271] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1099.107491] env[68492]: DEBUG nova.virt.hardware [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1099.108715] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-549d7756-6ae2-4b7b-be8b-908af7c6d5d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1099.117690] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6247e7f-7c11-4a67-9d8e-19229f8de5cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1099.391970] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Successfully created port: 6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1100.008176] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Successfully updated port: 6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1100.021999] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1100.022186] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquired lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1100.022336] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 1100.069612] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 1100.216084] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Updating instance_info_cache with network_info: [{"id": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "address": "fa:16:3e:93:fb:0d", "network": {"id": "dc53e794-e85b-4f1a-a9e0-14d6b0df638a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-374553970-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "84fa3c41af4c42cbb652c4ccb0f3785e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a407774d-9c2a-411d-9d6f-9ca733b97f3f", "external-id": "nsx-vlan-transportzone-710", "segmentation_id": 710, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e297e3f-ec", "ovs_interfaceid": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1100.228814] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Releasing lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1100.229470] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance network_info: |[{"id": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "address": "fa:16:3e:93:fb:0d", "network": {"id": "dc53e794-e85b-4f1a-a9e0-14d6b0df638a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-374553970-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "84fa3c41af4c42cbb652c4ccb0f3785e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a407774d-9c2a-411d-9d6f-9ca733b97f3f", "external-id": "nsx-vlan-transportzone-710", "segmentation_id": 710, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e297e3f-ec", "ovs_interfaceid": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1100.229650] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:93:fb:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a407774d-9c2a-411d-9d6f-9ca733b97f3f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e297e3f-ecce-455f-ba97-3f69383ce3a7', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1100.237958] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Creating folder: Project (84fa3c41af4c42cbb652c4ccb0f3785e). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1100.238542] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7578f7fb-5582-47b4-9554-5fdb9f384ded {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1100.249686] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Created folder: Project (84fa3c41af4c42cbb652c4ccb0f3785e) in parent group-v677434.
[ 1100.249872] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Creating folder: Instances. Parent ref: group-v677497. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1100.250119] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc16fb6b-f8d9-4b65-bac0-06477e9e8f3b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1100.259106] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Created folder: Instances in parent group-v677497.
[ 1100.259366] env[68492]: DEBUG oslo.service.loopingcall [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1100.259553] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1100.259754] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a4018177-6f1f-4fd3-94eb-8d9251370340 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1100.281558] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1100.281558] env[68492]: value = "task-3395434"
[ 1100.281558] env[68492]: _type = "Task"
[ 1100.281558] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1100.289579] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395434, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1100.565443] env[68492]: DEBUG nova.compute.manager [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Received event network-vif-plugged-6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1100.565705] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Acquiring lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1100.565858] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1100.566061] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1100.566203] env[68492]: DEBUG nova.compute.manager [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] No waiting events found dispatching network-vif-plugged-6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1100.566373] env[68492]: WARNING nova.compute.manager [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Received unexpected event network-vif-plugged-6e297e3f-ecce-455f-ba97-3f69383ce3a7 for instance with vm_state building and task_state spawning.
[ 1100.566548] env[68492]: DEBUG nova.compute.manager [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Received event network-changed-6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1100.566705] env[68492]: DEBUG nova.compute.manager [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Refreshing instance network info cache due to event network-changed-6e297e3f-ecce-455f-ba97-3f69383ce3a7. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1100.566904] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Acquiring lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1100.567024] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Acquired lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1100.567602] env[68492]: DEBUG nova.network.neutron [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Refreshing network info cache for port 6e297e3f-ecce-455f-ba97-3f69383ce3a7 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 1100.790811] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395434, 'name': CreateVM_Task} progress is 99%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1100.827833] env[68492]: DEBUG nova.network.neutron [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Updated VIF entry in instance network info cache for port 6e297e3f-ecce-455f-ba97-3f69383ce3a7. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 1100.828211] env[68492]: DEBUG nova.network.neutron [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Updating instance_info_cache with network_info: [{"id": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "address": "fa:16:3e:93:fb:0d", "network": {"id": "dc53e794-e85b-4f1a-a9e0-14d6b0df638a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-374553970-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "84fa3c41af4c42cbb652c4ccb0f3785e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a407774d-9c2a-411d-9d6f-9ca733b97f3f", "external-id": "nsx-vlan-transportzone-710", "segmentation_id": 710, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e297e3f-ec", "ovs_interfaceid": "6e297e3f-ecce-455f-ba97-3f69383ce3a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1100.843215] env[68492]: DEBUG oslo_concurrency.lockutils [req-25b83371-f819-427c-a371-7659c57a391b req-2cf47d95-e814-49a8-be7c-5e4e56e0d093 service nova] Releasing lock "refresh_cache-913d527c-f9f8-43da-b539-d1e2e2b71528" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1101.290604] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395434, 'name': CreateVM_Task} progress is 99%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1101.793818] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395434, 'name': CreateVM_Task, 'duration_secs': 1.322396} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1101.798023] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1101.798023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1101.798023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1101.798023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1101.798023] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa01ae36-1415-4af3-80fd-f5801c17a1b0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1101.802128] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for the task: (returnval){
[ 1101.802128] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52492118-0243-680f-ae69-644ac34a6f15"
[ 1101.802128] env[68492]: _type = "Task"
[ 1101.802128] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1101.811872] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52492118-0243-680f-ae69-644ac34a6f15, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1102.315168] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1102.315548] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1102.315746] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1104.474631] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1104.474944] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1112.226627] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1112.505367] env[68492]: WARNING oslo_vmware.rw_handles [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles response.begin()
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1112.505367] env[68492]: ERROR oslo_vmware.rw_handles
[ 1112.505747] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore1 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1112.507540] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1112.507540] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Copying Virtual Disk [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore1] vmware_temp/453db936-4beb-4a4a-a180-3730fc838f42/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1112.507735] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1c0ae70b-719d-42ec-ba6f-a34c0d552ba0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.517473] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){
[ 1112.517473] env[68492]: value = "task-3395435"
[ 1112.517473] env[68492]: _type = "Task"
[ 1112.517473] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1112.525710] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395435, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1113.027869] env[68492]: DEBUG oslo_vmware.exceptions [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1113.028163] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Releasing lock "[datastore1] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1113.028792] env[68492]: ERROR nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1113.028792] env[68492]: Faults: ['InvalidArgument']
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Traceback (most recent call last):
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] yield resources
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self.driver.spawn(context, instance, image_meta,
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self._fetch_image_if_missing(context, vi)
[ 1113.028792] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] image_cache(vi, tmp_image_ds_loc)
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] vm_util.copy_virtual_disk(
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] session._wait_for_task(vmdk_copy_task)
[ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line
157, in _wait_for_task [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return self.wait_for_task(task_ref) [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return evt.wait() [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] result = hub.switch() [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1113.029196] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return self.greenlet.switch() [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self.f(*self.args, **self.kw) [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] raise exceptions.translate_fault(task_info.error) [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Faults: ['InvalidArgument'] [ 1113.029562] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] [ 1113.029562] env[68492]: INFO nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Terminating instance [ 1113.031312] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1113.031502] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1113.032242] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d687401-e5f7-4b91-9bb4-8b247fbb3158 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.038661] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1113.038873] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-73df00cb-b046-4c76-b7f4-a6d277a06751 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.100376] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1113.100628] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Deleting contents of the VM from datastore datastore1 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1113.100766] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleting the datastore file [datastore1] 00387f6d-880b-4a0b-a4be-afb1fe4c844b {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1113.101057] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4febe9ae-7d47-4c31-bf90-bd5dca92e76f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.108354] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for the task: (returnval){ [ 1113.108354] env[68492]: value = "task-3395437" [ 1113.108354] env[68492]: _type = "Task" [ 1113.108354] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1113.115833] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395437, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1113.619019] env[68492]: DEBUG oslo_vmware.api [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Task: {'id': task-3395437, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071111} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1113.619285] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1113.619465] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Deleted contents of the VM from datastore datastore1 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1113.619633] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1113.619813] env[68492]: INFO nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Took 0.59 seconds to destroy the instance on the hypervisor. 
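The failed build above is the oslo.vmware task pattern end to end: CopyVirtualDisk_Task is created, polled until it reaches a terminal state ("progress is 0%" ... "completed successfully" or a fault), and a task-level error such as the InvalidArgument fault on fileType is translated into a VimFaultException that unwinds through wait_for_task, aborts the spawn, and leads to the destroy/delete sequence just logged. A minimal sketch of that poll loop, with a hypothetical get_task_info callable standing in for the real vCenter TaskInfo lookup (illustrative names, not the actual oslo_vmware internals):

import time

class VimFaultException(Exception):
    # Simplified stand-in for oslo_vmware.exceptions.VimFaultException.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, poll_interval=0.5):
    # get_task_info is a hypothetical callable returning an object with
    # .state ('running', 'success' or 'error') and .error, roughly what
    # _poll_task reads off a vCenter TaskInfo.
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # The "raise exceptions.translate_fault(task_info.error)"
            # frame in the traceback above corresponds to this branch.
            raise VimFaultException(info.error.get('faults', []),
                                    info.error.get('message', ''))
        # Still running: a real looping call logs "progress is N%"
        # (the lines above) and polls again.
        time.sleep(poll_interval)

# Demo with a fake task that succeeds on the third poll:
states = iter(['running', 'running', 'success'])
class FakeInfo:
    def __init__(self, state):
        self.state, self.error = state, {}
wait_for_task(lambda: FakeInfo(next(states)), poll_interval=0.01)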
[ 1113.621890] env[68492]: DEBUG nova.compute.claims [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1113.622079] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1113.622297] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1113.940665] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-651aaac7-866a-4e8f-adcd-26924c33316a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.948515] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42ec1eda-d927-425c-b010-6e4c906ff018 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.979246] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939e0ae1-d02d-4b6d-b746-b9023e8f5cbe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.986247] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15d710ce-f274-4b54-a990-3be4f0e842ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.999086] env[68492]: DEBUG nova.compute.provider_tree [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1114.007491] env[68492]: DEBUG nova.scheduler.client.report [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1114.022936] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 
tempest-MigrationsAdminTest-300128343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.400s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.023684] env[68492]: ERROR nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1114.023684] env[68492]: Faults: ['InvalidArgument'] [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Traceback (most recent call last): [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self.driver.spawn(context, instance, image_meta, [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self._fetch_image_if_missing(context, vi) [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] image_cache(vi, tmp_image_ds_loc) [ 1114.023684] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] vm_util.copy_virtual_disk( [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] session._wait_for_task(vmdk_copy_task) [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return self.wait_for_task(task_ref) [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return evt.wait() [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 
00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] result = hub.switch() [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] return self.greenlet.switch() [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1114.024065] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] self.f(*self.args, **self.kw) [ 1114.024439] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1114.024439] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] raise exceptions.translate_fault(task_info.error) [ 1114.024439] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1114.024439] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Faults: ['InvalidArgument'] [ 1114.024439] env[68492]: ERROR nova.compute.manager [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] [ 1114.024696] env[68492]: DEBUG nova.compute.utils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1114.026684] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Build of instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b was re-scheduled: A specified parameter was not correct: fileType [ 1114.026684] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1114.027143] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1114.027336] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1114.027496] env[68492]: DEBUG nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1114.027661] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1114.332238] env[68492]: DEBUG nova.network.neutron [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1114.344541] env[68492]: INFO nova.compute.manager [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Took 0.32 seconds to deallocate network for instance. [ 1114.451944] env[68492]: INFO nova.scheduler.client.report [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Deleted allocations for instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b [ 1114.473562] env[68492]: DEBUG oslo_concurrency.lockutils [None req-88520cd6-befc-400f-b7b3-9d7558b5ca37 tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.761s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.474761] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 17.909s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.474984] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Acquiring lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.475223] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.475420] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.477992] env[68492]: INFO nova.compute.manager [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Terminating instance [ 1114.479663] env[68492]: DEBUG nova.compute.manager [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1114.479915] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1114.480413] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8d16dccb-f16d-458f-b504-4a45af3e3d24 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.489897] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d588370-ea52-4773-9bc5-d9974b1595c5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.502034] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1114.524374] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 00387f6d-880b-4a0b-a4be-afb1fe4c844b could not be found. [ 1114.524611] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1114.524795] env[68492]: INFO nova.compute.manager [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Took 0.04 seconds to destroy the instance on the hypervisor. 
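The Acquiring / acquired / "released" triples that bracket nearly every operation here (instance UUID locks, "compute_resources", the "-events" locks, the image-cache paths) come from oslo_concurrency.lockutils, which also records how long a caller waited for and then held each named lock. A rough, self-contained stand-in for that pattern using plain threading; the real lockutils adds external file locks, fair semaphores and decorators on top:

import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()

@contextmanager
def logged_lock(name):
    # One shared lock per name, with the waited/held bookkeeping
    # visible in the log lines above.
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}"')
    start = time.monotonic()
    with lock:
        print(f'Lock "{name}" acquired :: waited '
              f'{time.monotonic() - start:.3f}s')
        held_from = time.monotonic()
        try:
            yield
        finally:
            print(f'Lock "{name}" "released" :: held '
                  f'{time.monotonic() - held_from:.3f}s')

# Serializing build/terminate on one instance UUID, as above:
with logged_lock("00387f6d-880b-4a0b-a4be-afb1fe4c844b"):
    pass  # critical section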
[ 1114.525046] env[68492]: DEBUG oslo.service.loopingcall [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1114.525276] env[68492]: DEBUG nova.compute.manager [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1114.525374] env[68492]: DEBUG nova.network.neutron [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1114.547359] env[68492]: DEBUG nova.network.neutron [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1114.552180] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.552410] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.554159] env[68492]: INFO nova.compute.claims [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1114.557347] env[68492]: INFO nova.compute.manager [-] [instance: 00387f6d-880b-4a0b-a4be-afb1fe4c844b] Took 0.03 seconds to deallocate network for instance. 
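The dance around "compute_resources" (instance_claim when a build starts, abort_instance_claim after the failed spawn, each followed by a placement inventory report) is the resource-tracker pattern: reserve capacity before building and hand it back if the build raises. A toy version under invented names, not the real ResourceTracker API:

class NoValidHost(Exception):
    pass

class ToyResourceTracker:
    # Illustrative only: reserve on claim, give back on abort.
    def __init__(self, vcpus, memory_mb):
        self.free_vcpus = vcpus
        self.free_memory_mb = memory_mb

    def instance_claim(self, flavor):
        if (flavor['vcpus'] > self.free_vcpus
                or flavor['memory_mb'] > self.free_memory_mb):
            raise NoValidHost('insufficient resources')
        self.free_vcpus -= flavor['vcpus']
        self.free_memory_mb -= flavor['memory_mb']
        return dict(flavor)

    def abort_instance_claim(self, claim):
        # The "Aborting claim" path taken after the VimFault above.
        self.free_vcpus += claim['vcpus']
        self.free_memory_mb += claim['memory_mb']

rt = ToyResourceTracker(vcpus=48, memory_mb=196590)        # totals reported above
claim = rt.instance_claim({'vcpus': 1, 'memory_mb': 128})  # m1.nano
try:
    raise RuntimeError('spawn failed')  # stand-in for the copy-disk fault
except RuntimeError:
    rt.abort_instance_claim(claim)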
[ 1114.643475] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2a32e24d-d715-46b6-8f34-d3eb32d6e19f tempest-MigrationsAdminTest-300128343 tempest-MigrationsAdminTest-300128343-project-member] Lock "00387f6d-880b-4a0b-a4be-afb1fe4c844b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.885789] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a54df1e2-8e90-44c8-b619-2a48cb401f84 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.893188] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27378bd9-d360-4290-b6b5-0d333a2fad6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.922613] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3550b90d-4d7a-4a0c-ba97-0dda12f47eb6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.929446] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffbc18f1-db8a-4e8a-a871-53b9d73ab51c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.943010] env[68492]: DEBUG nova.compute.provider_tree [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1114.953206] env[68492]: DEBUG nova.scheduler.client.report [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1114.968910] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.969421] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1115.006410] env[68492]: DEBUG nova.compute.utils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1115.007640] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1115.007810] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1115.016673] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1115.075922] env[68492]: DEBUG nova.policy [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47a1a6a81e7746829ceef3ff3a26f156', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1237f182f6d4419691044a6dbef4f6a4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1115.079825] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1115.106314] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1115.106549] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1115.106713] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1115.106911] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1115.107074] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1115.107224] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1115.107429] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1115.107587] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1115.107786] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1115.107950] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1115.108144] env[68492]: DEBUG nova.virt.hardware [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1115.109032] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3363ec-7fa2-421f-bd5f-4e01c0da6bd5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.118277] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3580998-6014-435f-9a65-645bfb82d138 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.232896] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1115.403522] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Successfully created port: a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1116.093916] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Successfully updated port: a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1116.108071] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1116.108071] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquired lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1116.108071] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1116.148869] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1116.315949] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Updating instance_info_cache with network_info: [{"id": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "address": "fa:16:3e:f4:b5:a1", "network": {"id": "82966659-87fa-471b-ae42-28072b614f1e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-878579575-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1237f182f6d4419691044a6dbef4f6a4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bab6a6c3-1c5c-4776-b21b-dec21196d702", "external-id": "nsx-vlan-transportzone-634", "segmentation_id": 634, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1636ade-f8", "ovs_interfaceid": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1116.327440] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Releasing lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1116.327440] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance network_info: |[{"id": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "address": "fa:16:3e:f4:b5:a1", "network": {"id": "82966659-87fa-471b-ae42-28072b614f1e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-878579575-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1237f182f6d4419691044a6dbef4f6a4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bab6a6c3-1c5c-4776-b21b-dec21196d702", "external-id": "nsx-vlan-transportzone-634", "segmentation_id": 634, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1636ade-f8", "ovs_interfaceid": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1116.327761] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f4:b5:a1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bab6a6c3-1c5c-4776-b21b-dec21196d702', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a1636ade-f85c-4e0f-b4dc-41b2aaa48548', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1116.335355] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Creating folder: Project (1237f182f6d4419691044a6dbef4f6a4). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1116.335915] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-13585aaa-d774-42ff-8645-437456b7f65f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.348952] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Created folder: Project (1237f182f6d4419691044a6dbef4f6a4) in parent group-v677434. [ 1116.349223] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Creating folder: Instances. Parent ref: group-v677500. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1116.349652] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e682195e-16b4-4edd-a303-e2a8d0419874 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.361127] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Created folder: Instances in parent group-v677500. 
[ 1116.361395] env[68492]: DEBUG oslo.service.loopingcall [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1116.361599] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1116.361808] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88cc697b-6a63-4176-a1d5-5d863d987664 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.386498] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1116.386498] env[68492]: value = "task-3395440"
[ 1116.386498] env[68492]: _type = "Task"
[ 1116.386498] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1116.394268] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395440, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1116.417094] env[68492]: DEBUG nova.compute.manager [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Received event network-vif-plugged-a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1116.417304] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Acquiring lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1116.418470] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1116.418470] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1116.418470] env[68492]: DEBUG nova.compute.manager [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] No waiting events found dispatching network-vif-plugged-a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1116.418470] env[68492]: WARNING nova.compute.manager [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Received unexpected event network-vif-plugged-a1636ade-f85c-4e0f-b4dc-41b2aaa48548 for instance with vm_state building and task_state spawning.
[ 1116.418817] env[68492]: DEBUG nova.compute.manager [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Received event network-changed-a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1116.418817] env[68492]: DEBUG nova.compute.manager [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Refreshing instance network info cache due to event network-changed-a1636ade-f85c-4e0f-b4dc-41b2aaa48548. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1116.418817] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Acquiring lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1116.418817] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Acquired lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1116.418817] env[68492]: DEBUG nova.network.neutron [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Refreshing network info cache for port a1636ade-f85c-4e0f-b4dc-41b2aaa48548 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 1116.773251] env[68492]: DEBUG nova.network.neutron [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Updated VIF entry in instance network info cache for port a1636ade-f85c-4e0f-b4dc-41b2aaa48548. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 1116.773611] env[68492]: DEBUG nova.network.neutron [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Updating instance_info_cache with network_info: [{"id": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "address": "fa:16:3e:f4:b5:a1", "network": {"id": "82966659-87fa-471b-ae42-28072b614f1e", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-878579575-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1237f182f6d4419691044a6dbef4f6a4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bab6a6c3-1c5c-4776-b21b-dec21196d702", "external-id": "nsx-vlan-transportzone-634", "segmentation_id": 634, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1636ade-f8", "ovs_interfaceid": "a1636ade-f85c-4e0f-b4dc-41b2aaa48548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1116.783096] env[68492]: DEBUG oslo_concurrency.lockutils [req-8dd7d717-f87f-4eef-b5ea-d313739585f5 req-cdf41188-450d-4902-adef-1f37486fc33d service nova] Releasing lock "refresh_cache-cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1116.896747] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395440, 'name': CreateVM_Task, 'duration_secs': 0.280556} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1116.896950] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1116.897609] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1116.897982] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1116.898102] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1116.898345] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-69a074e8-7373-4c5d-be33-a694f790f389 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.902836] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for the task: (returnval){
[ 1116.902836] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528b73d5-d779-751a-b251-2f33b976793c"
[ 1116.902836] env[68492]: _type = "Task"
[ 1116.902836] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1116.911643] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528b73d5-d779-751a-b251-2f33b976793c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1117.232703] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1117.413542] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1117.413783] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1117.413977] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1118.231551] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1118.231732] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}}
[ 1118.231857] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}}
[ 1118.266781] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267124] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267124] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267225] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267347] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267467] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267612] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267761] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267880] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.267998] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}}
[ 1118.268135] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}}
[ 1118.268709] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1118.268852] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}}
[ 1118.292072] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] There are 1 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}}
[ 1118.292367] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e3ea0b7a-bc22-4285-bcdd-560c509c09e9] Instance has had 0 of 5 cleanup attempts {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11217}}
[ 1119.293191] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1119.306034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.306034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.306326] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1119.306326] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1119.308781] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eba01e8-92df-494c-a0a0-d8fd0e652d69 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1119.319690] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bb54f49-2cf7-4ea3-b211-ad89f6b5a43b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1119.335743] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-366a4a3f-dc04-4df8-88fa-6ca6c2085027 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1119.343343] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a14feb95-1062-4a1a-8c79-f9129d360708 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1119.378854] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1119.379008] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.379340] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.570591] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.570762] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.570895] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571029] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571162] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571282] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571399] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571514] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571630] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.571747] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}}
[ 1119.588938] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.600483] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5c5946ea-9bda-4c9c-80cb-e8a580b74148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.612950] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 37f2e678-b217-4bf3-83e6-74d85ee8a446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.632181] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 888dac8e-013f-4024-9fa7-4cc13c361268 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.643089] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.655549] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a6bf3888-5c1a-4a12-85a9-221cbba6457b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.669262] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2785a54b-6fd5-413d-bdd1-ead082d8777b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.679502] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2d422f7c-9295-4b08-a623-ae07bacb3e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.692816] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 61d932c3-4c41-4648-b5ee-c083ed425e1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.703215] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c9618d2a-72ce-4395-b739-2585861bc446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.722019] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9bffaa25-3195-4077-a978-6b0dcc4b8ecd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.733241] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.744174] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.756128] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49885647-f6a0-468a-bf58-206de779c896 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.773464] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}}
[ 1119.773571] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1119.773669] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1119.801194] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1119.825209] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1119.825420] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 1119.848662] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1119.876562] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1120.284538] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c873e295-a7b1-40c8-b9d7-8d88c6b39888 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1120.292475] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9998c520-7b84-46cb-b566-d05de8b03213 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1120.326392] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4776707-7506-42c1-9127-f2f05184cd68 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1120.333813] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a635171-55e7-4d7b-adee-9ebc8a004bfe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1120.347194] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1120.357943] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1120.373592] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1120.373790] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.995s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1121.021547] env[68492]: DEBUG oslo_concurrency.lockutils [None req-98361909-4d0e-4405-ae94-2821eeeea069 tempest-InstanceActionsNegativeTestJSON-1912133732 tempest-InstanceActionsNegativeTestJSON-1912133732-project-member] Acquiring lock "5bec90ae-12e8-4620-ac96-76d82e123f7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1121.021822] env[68492]: DEBUG oslo_concurrency.lockutils [None req-98361909-4d0e-4405-ae94-2821eeeea069 tempest-InstanceActionsNegativeTestJSON-1912133732 tempest-InstanceActionsNegativeTestJSON-1912133732-project-member] Lock "5bec90ae-12e8-4620-ac96-76d82e123f7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1121.232584] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1121.232584] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1121.232584] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1122.240928] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1123.231724] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1123.231724] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}}
[ 1124.231941] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1125.231378] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1125.231549] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}}
[ 1147.303438] env[68492]: WARNING oslo_vmware.rw_handles [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles response.begin()
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1147.303438] env[68492]: ERROR oslo_vmware.rw_handles
[ 1147.304182] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1147.306513] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1147.306793] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Copying Virtual Disk [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/8796eb48-3846-4182-9d61-6b7b8ec52b67/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1147.307118] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bfee7e4f-d42a-46a1-80eb-f3f39679eac4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.315779] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for the task: (returnval){
[ 1147.315779] env[68492]: value = "task-3395441"
[ 1147.315779] env[68492]: _type = "Task"
[ 1147.315779] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1147.323820] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Task: {'id': task-3395441, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1147.825298] env[68492]: DEBUG oslo_vmware.exceptions [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1147.825633] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1147.826224] env[68492]: ERROR nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1147.826224] env[68492]: Faults: ['InvalidArgument']
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Traceback (most recent call last):
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] yield resources
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self.driver.spawn(context, instance, image_meta,
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self._fetch_image_if_missing(context, vi)
[ 1147.826224] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] image_cache(vi, tmp_image_ds_loc)
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] vm_util.copy_virtual_disk(
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] session._wait_for_task(vmdk_copy_task)
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return self.wait_for_task(task_ref)
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return evt.wait()
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] result = hub.switch()
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1147.826723] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return self.greenlet.switch()
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self.f(*self.args, **self.kw)
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] raise exceptions.translate_fault(task_info.error)
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Faults: ['InvalidArgument']
[ 1147.827086] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c]
[ 1147.827086] env[68492]: INFO nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Terminating instance
[ 1147.828231] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1147.828437] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1147.828674] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e08fd70-5b12-4a28-8b22-20cd6550cccf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.831105] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1147.831299] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1147.832034] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-403588e3-ebec-4ca7-8a12-95f1aecea3aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.838982] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1147.839246] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-05afeffa-cd94-4e5c-b67f-a87097fcd5b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.841543] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1147.841716] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1147.842694] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-26ac653e-721d-4105-8f0e-733c379e85c5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.848623] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for the task: (returnval){
[ 1147.848623] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52f0e93d-039b-d842-6c68-18e51ee2b858"
[ 1147.848623] env[68492]: _type = "Task"
[ 1147.848623] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1147.855877] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52f0e93d-039b-d842-6c68-18e51ee2b858, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1147.915147] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1147.915371] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1147.915550] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Deleting the datastore file [datastore2] 4f1ede2c-7ee7-415f-a656-6c792a1b508c {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1147.915856] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d9dc43d4-69d2-4ea3-90a9-9dd41db9cba6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1147.924394] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for the task: (returnval){
[ 1147.924394] env[68492]: value = "task-3395443"
[ 1147.924394] env[68492]: _type = "Task"
[ 1147.924394] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1147.932303] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Task: {'id': task-3395443, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1148.361601] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1148.361855] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Creating directory with path [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1148.362019] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-34465af7-7c49-40e0-a227-95f135d14111 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.372845] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Created directory with path [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1148.373065] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Fetch image to [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1148.373237] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1148.373971] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc8fc440-802e-420d-8cd8-39f49ff8ee35 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.380705] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02620da-0559-4e1c-9f58-f52279484616 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.389625] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2f96eec-acf0-4237-99de-7854a9f7c6bf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.422271] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7abeec86-e936-4e9b-a440-9ed08e5c556a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.434216] env[68492]: DEBUG oslo_vmware.api [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Task: {'id': task-3395443, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076392} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1148.434427] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6476668c-b883-4548-9250-76b72d8dc46a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1148.436148] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1148.436371] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1148.436544] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1148.436713] env[68492]: INFO nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1148.440915] env[68492]: DEBUG nova.compute.claims [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1148.440997] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1148.441298] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1148.459391] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1148.514322] env[68492]: DEBUG oslo_vmware.rw_handles [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1148.575255] env[68492]: DEBUG oslo_vmware.rw_handles [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1148.575457] env[68492]: DEBUG oslo_vmware.rw_handles [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1148.824949] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-000b4535-1835-49cd-9d43-6033dd1dbe0c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.832448] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dd26f95-bfee-48f8-af9f-36bf05845813 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.862144] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44df995e-d633-4c28-af85-c85fea88a914 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.869241] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c1984c8-6350-40bb-9bb7-785f71628394 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.882025] env[68492]: DEBUG nova.compute.provider_tree [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1148.891183] env[68492]: DEBUG nova.scheduler.client.report [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1148.907093] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.466s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1148.907625] env[68492]: ERROR nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.907625] env[68492]: Faults: ['InvalidArgument'] [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Traceback (most recent call last): [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in 
_build_and_run_instance [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self.driver.spawn(context, instance, image_meta, [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self._fetch_image_if_missing(context, vi) [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] image_cache(vi, tmp_image_ds_loc) [ 1148.907625] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] vm_util.copy_virtual_disk( [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] session._wait_for_task(vmdk_copy_task) [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return self.wait_for_task(task_ref) [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return evt.wait() [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] result = hub.switch() [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] return self.greenlet.switch() [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1148.908026] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] self.f(*self.args, **self.kw) [ 1148.908411] env[68492]: ERROR nova.compute.manager [instance: 
4f1ede2c-7ee7-415f-a656-6c792a1b508c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1148.908411] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] raise exceptions.translate_fault(task_info.error) [ 1148.908411] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.908411] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Faults: ['InvalidArgument'] [ 1148.908411] env[68492]: ERROR nova.compute.manager [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] [ 1148.908411] env[68492]: DEBUG nova.compute.utils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1148.909813] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Build of instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c was re-scheduled: A specified parameter was not correct: fileType [ 1148.909813] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1148.910205] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1148.910378] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1148.910546] env[68492]: DEBUG nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1148.910709] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1149.216377] env[68492]: DEBUG nova.network.neutron [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1149.228687] env[68492]: INFO nova.compute.manager [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Took 0.32 seconds to deallocate network for instance. [ 1149.319688] env[68492]: INFO nova.scheduler.client.report [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Deleted allocations for instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c [ 1149.339795] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1ea6e151-0538-4fd4-9f1c-056f74d4c976 tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 563.922s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.341025] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 361.791s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.341266] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Acquiring lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1149.341408] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock 
"4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.341576] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.343859] env[68492]: INFO nova.compute.manager [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Terminating instance [ 1149.345627] env[68492]: DEBUG nova.compute.manager [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1149.345756] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1149.346275] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b42a116c-59a9-43de-85e1-2d93389390d0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.355482] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f241a58c-69f7-4cd1-ba6a-d9e57aa37fff {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.387021] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4f1ede2c-7ee7-415f-a656-6c792a1b508c could not be found. [ 1149.387328] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1149.387426] env[68492]: INFO nova.compute.manager [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1149.387689] env[68492]: DEBUG oslo.service.loopingcall [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1149.388066] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1149.390352] env[68492]: DEBUG nova.compute.manager [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1149.390462] env[68492]: DEBUG nova.network.neutron [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1149.429217] env[68492]: DEBUG nova.network.neutron [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1149.436357] env[68492]: INFO nova.compute.manager [-] [instance: 4f1ede2c-7ee7-415f-a656-6c792a1b508c] Took 0.05 seconds to deallocate network for instance. [ 1149.443472] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1149.444288] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.445267] env[68492]: INFO nova.compute.claims [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1149.533240] env[68492]: DEBUG oslo_concurrency.lockutils [None req-12364fa8-158c-4a7a-8d0d-67aa5c429b9c tempest-ServersV294TestFqdnHostnames-1047329863 tempest-ServersV294TestFqdnHostnames-1047329863-project-member] Lock "4f1ede2c-7ee7-415f-a656-6c792a1b508c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.192s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.803151] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6c4ebb0-57f3-4c09-bcd1-e3cd0eece357 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.810451] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc11ce2f-2388-4e44-bc47-2ee1508a6e7e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.839289] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25fc9698-87a4-48b1-ba6f-dc30e7865a31 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.845986] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb0bcaa-2188-4a8f-8392-d394934f08e5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.859887] env[68492]: DEBUG nova.compute.provider_tree [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1149.869627] env[68492]: DEBUG nova.scheduler.client.report [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1149.881977] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.438s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.882451] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1149.912687] env[68492]: DEBUG nova.compute.utils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1149.913864] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Allocating IP information in the background. 
{{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1149.914060] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1149.922385] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1149.987686] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1149.993871] env[68492]: DEBUG nova.policy [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9339190b2eaa4beb93fa2c49d0e90511', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e79b69dfed84e4fb3595a23d921d047', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1150.014593] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1150.014881] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1150.015073] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Image limits 0:0:0 {{(pid=68492) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1150.015278] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1150.015429] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1150.015578] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1150.015798] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1150.015976] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1150.016158] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1150.016320] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1150.016492] env[68492]: DEBUG nova.virt.hardware [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1150.017372] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa1c7d07-6fd2-4351-b347-277cc380b624 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.025194] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea87ad2-7614-4ef9-9511-7fb5f45919e4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.331063] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: 
aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Successfully created port: a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1151.139379] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Successfully updated port: a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1151.156535] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.156535] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquired lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.156625] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1151.194514] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1151.257490] env[68492]: DEBUG nova.compute.manager [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Received event network-vif-plugged-a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1151.257706] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Acquiring lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.257903] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1151.258076] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1151.258234] env[68492]: DEBUG nova.compute.manager [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] No waiting events found dispatching network-vif-plugged-a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1151.258389] env[68492]: WARNING nova.compute.manager [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Received unexpected event network-vif-plugged-a3603d03-1b1f-4508-a7bc-d03ea42b305e for instance with vm_state building and task_state spawning. [ 1151.258539] env[68492]: DEBUG nova.compute.manager [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Received event network-changed-a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1151.258688] env[68492]: DEBUG nova.compute.manager [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Refreshing instance network info cache due to event network-changed-a3603d03-1b1f-4508-a7bc-d03ea42b305e. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1151.258842] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Acquiring lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.336753] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "913d527c-f9f8-43da-b539-d1e2e2b71528" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.364415] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Updating instance_info_cache with network_info: [{"id": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "address": "fa:16:3e:94:d5:e8", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3603d03-1b", "ovs_interfaceid": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1151.378105] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Releasing lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1151.378456] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance network_info: |[{"id": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "address": "fa:16:3e:94:d5:e8", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": 
"c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3603d03-1b", "ovs_interfaceid": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1151.378806] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Acquired lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.379086] env[68492]: DEBUG nova.network.neutron [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Refreshing network info cache for port a3603d03-1b1f-4508-a7bc-d03ea42b305e {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1151.380732] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:d5:e8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27138a4c-60c9-45fb-bf37-4c2f765315a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a3603d03-1b1f-4508-a7bc-d03ea42b305e', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1151.389297] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Creating folder: Project (9e79b69dfed84e4fb3595a23d921d047). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1151.390278] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8e7637f4-de62-4539-aa6a-01c8e497c14f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.403648] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Created folder: Project (9e79b69dfed84e4fb3595a23d921d047) in parent group-v677434. [ 1151.403842] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Creating folder: Instances. Parent ref: group-v677503. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1151.404105] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-58b94476-e088-435a-80ad-c96861b3d213 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.413110] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Created folder: Instances in parent group-v677503. [ 1151.413341] env[68492]: DEBUG oslo.service.loopingcall [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1151.413524] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1151.413715] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5e6ae8a0-f126-47aa-8092-238930f829fa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.437503] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1151.437503] env[68492]: value = "task-3395446" [ 1151.437503] env[68492]: _type = "Task" [ 1151.437503] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1151.444237] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395446, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1151.666125] env[68492]: DEBUG nova.network.neutron [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Updated VIF entry in instance network info cache for port a3603d03-1b1f-4508-a7bc-d03ea42b305e. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1151.666125] env[68492]: DEBUG nova.network.neutron [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Updating instance_info_cache with network_info: [{"id": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "address": "fa:16:3e:94:d5:e8", "network": {"id": "b8fccf7d-ced8-43f3-aeb8-0c266de33587", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.248", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c89109061376457ab5ab750f8f509d25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27138a4c-60c9-45fb-bf37-4c2f765315a3", "external-id": "nsx-vlan-transportzone-736", "segmentation_id": 736, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa3603d03-1b", "ovs_interfaceid": "a3603d03-1b1f-4508-a7bc-d03ea42b305e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1151.676961] env[68492]: DEBUG oslo_concurrency.lockutils [req-4386c88e-56f1-4877-ac60-2df2ceb9f7df req-7f924e7c-d814-4acf-8d0f-c3fddc3d3df5 service nova] Releasing lock "refresh_cache-aacdc31e-9a31-4745-b48b-f23a3b16ae9c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1151.947059] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395446, 'name': CreateVM_Task, 'duration_secs': 0.330841} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1151.947211] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1151.947820] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.947978] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.948306] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1151.948543] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7bf34e07-3c48-4182-86d7-6bdbd373b9d3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.953029] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for the task: (returnval){ [ 1151.953029] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525a31dd-a9e3-5dad-ad48-e42fcb8bc2b4" [ 1151.953029] env[68492]: _type = "Task" [ 1151.953029] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1151.960779] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525a31dd-a9e3-5dad-ad48-e42fcb8bc2b4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1152.463326] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1152.463678] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1152.464022] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1152.834867] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.202431] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1160.223799] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 1160.223799] env[68492]: value = "domain-c8" [ 1160.223799] env[68492]: _type = "ClusterComputeResource" [ 1160.223799] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1160.225158] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2af755b1-95e6-4f88-bf40-5e3d99393f2f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.242228] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 10 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1160.242400] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid cbddbd81-2931-4d28-bd69-ef3f8f1e366c {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.242590] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid fcf9c3f0-4f46-4069-887f-fd666e6b3c53 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.242747] env[68492]: DEBUG nova.compute.manager [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.242926] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 3b1ce4e1-bbad-4030-84d9-f814a44eec4a {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243108] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 569b49ff-047a-4494-b869-6598764da9d7 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243266] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 8c72085d-697c-4829-866a-4d642f18d2f6 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243415] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243585] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 913d527c-f9f8-43da-b539-d1e2e2b71528 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243747] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.243896] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid aacdc31e-9a31-4745-b48b-f23a3b16ae9c {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1160.244242] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.244495] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.244710] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.244941] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.245170] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "569b49ff-047a-4494-b869-6598764da9d7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.245373] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "8c72085d-697c-4829-866a-4d642f18d2f6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.245566] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.245754] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "913d527c-f9f8-43da-b539-d1e2e2b71528" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.245969] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1160.246194] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.274951] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1176.605046] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.231594] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1178.231876] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1178.231876] env[68492]: 
DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1178.251948] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252111] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252228] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252355] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252481] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252603] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252724] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252841] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.252959] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.253089] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1178.253212] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
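
Throughout this log, the "Acquiring lock ...", "Lock ... acquired ... waited Ns" and "Lock ... \"released\" ... held Ns" triplets come from oslo.concurrency's lockutils, which Nova uses to serialize per-instance and per-resource work (the "compute_resources" lock just below is one example). A minimal sketch of the same pattern; the function name here is illustrative, not Nova's code:

    from oslo_concurrency import lockutils

    # Context-manager form: with debug logging enabled this emits the same
    # acquire/wait/hold DEBUG lines interleaved through this log.
    with lockutils.lock("compute_resources"):
        pass  # critical section

    # Decorator form, as used by ResourceTracker methods and the
    # query_driver_power_state_and_sync closures seen above.
    @lockutils.synchronized("compute_resources")
    def update_available_resource():
        pass  # runs with the named lock held
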
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1179.231506] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.231761] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.242852] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.243131] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.243304] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1179.243457] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1179.244593] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72e542e7-a50f-44fd-8b6f-ed231c58fdd1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.253391] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ca7f7a-e8c0-448d-b5d5-629aa18ad517 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.267211] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28ab48b4-7458-4a1a-b4e6-7db233a40711 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.273534] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5ac81f2-bf52-4005-8cbe-d54300bbd171 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.302538] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180952MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1179.302761] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.302962] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.379166] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379336] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379455] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379580] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379702] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379818] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.379937] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.380066] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.380183] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.380297] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1179.393378] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5c5946ea-9bda-4c9c-80cb-e8a580b74148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.404737] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 37f2e678-b217-4bf3-83e6-74d85ee8a446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.415160] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 888dac8e-013f-4024-9fa7-4cc13c361268 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.425275] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.434852] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a6bf3888-5c1a-4a12-85a9-221cbba6457b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.444391] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2785a54b-6fd5-413d-bdd1-ead082d8777b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.453987] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2d422f7c-9295-4b08-a623-ae07bacb3e9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.463872] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 61d932c3-4c41-4648-b5ee-c083ed425e1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.477243] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c9618d2a-72ce-4395-b739-2585861bc446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.487263] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9bffaa25-3195-4077-a978-6b0dcc4b8ecd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.496735] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.506185] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.516441] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49885647-f6a0-468a-bf58-206de779c896 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.525830] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.536103] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5bec90ae-12e8-4620-ac96-76d82e123f7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
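
The per-instance allocation logged above is always {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, so the tracker's "Final resource view" just below is plain arithmetic: 10 tracked instances give used_disk=10GB and used_vcpus=10, and used_ram=1792MB is 10 x 128MB plus the 512MB reserved in the provider inventory. A self-contained sketch of that bookkeeping (illustrative only; the real ResourceTracker also handles migrations, PCI devices, and more):

    # Per-instance allocation, as logged repeatedly above.
    ALLOC = {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}

    def final_resource_view(num_instances, reserved_mb=512):
        """Mimic the arithmetic behind the 'Final resource view' line."""
        used = {rc: amount * num_instances for rc, amount in ALLOC.items()}
        # The tracker's used_ram figure folds in the reserved memory.
        used_ram_mb = used["MEMORY_MB"] + reserved_mb
        return used, used_ram_mb

    used, used_ram_mb = final_resource_view(10)
    assert used == {"DISK_GB": 10, "MEMORY_MB": 1280, "VCPU": 10}
    assert used_ram_mb == 1792  # matches used_ram=1792MB in the view below
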
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1179.536361] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1179.536514] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1179.814561] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6645003c-ad54-4207-b702-493debcf5d54 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.822072] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-728c28d2-d2de-4eea-b580-40b086f22e94 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.851764] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e62703b4-36b8-4eb8-b269-3f6f3ef68e26 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.858907] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fd91d13-d900-441e-873c-d9c8dbb0507f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.872545] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1179.880612] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1179.893365] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1179.893548] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1181.889065] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.230664] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.231638] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.230899] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.231125] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.231372] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1193.846872] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1193.846872] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.165337] env[68492]: DEBUG oslo_concurrency.lockutils [None req-27765813-a0ad-45bd-9761-047a220ae9aa tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Acquiring lock "9694688e-b937-4999-9b25-3caea82695b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1194.165460] env[68492]: DEBUG oslo_concurrency.lockutils [None req-27765813-a0ad-45bd-9761-047a220ae9aa tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Lock "9694688e-b937-4999-9b25-3caea82695b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1194.589167] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb53b3dc-3a8e-4491-8226-264b6e926874 tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Acquiring lock "51e8e546-2bd7-495b-a81d-a6cdc4dba99c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1194.589409] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb53b3dc-3a8e-4491-8226-264b6e926874 tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Lock "51e8e546-2bd7-495b-a81d-a6cdc4dba99c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.541071] env[68492]: WARNING oslo_vmware.rw_handles [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1195.541071] env[68492]: ERROR oslo_vmware.rw_handles [ 1195.541924] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1195.543571] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1195.543779] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 
tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Copying Virtual Disk [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/58705963-7f38-44d8-bade-63cbd55bdcab/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1195.543977] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5286bbd5-74a7-476f-b049-1d7635bc55fb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.553950] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for the task: (returnval){ [ 1195.553950] env[68492]: value = "task-3395447" [ 1195.553950] env[68492]: _type = "Task" [ 1195.553950] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1195.561740] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Task: {'id': task-3395447, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1195.600582] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8d7cb0f9-a084-482b-9860-8d9014b0127f tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "6a630f7b-3c45-42b2-b8ab-e93490cc1eb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.600582] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8d7cb0f9-a084-482b-9860-8d9014b0127f tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "6a630f7b-3c45-42b2-b8ab-e93490cc1eb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1196.066290] env[68492]: DEBUG oslo_vmware.exceptions [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Fault InvalidArgument not matched. 
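
The "Fault InvalidArgument not matched" line logged after the CopyVirtualDisk_Task fails is oslo.vmware's get_fault_class reporting that no dedicated exception class maps to that vCenter fault name, so the caller receives a generic VimFaultException carrying the raw fault names, exactly as the tracebacks in this log show. A hedged sketch of catching it; the helper is illustrative:

    from oslo_vmware import exceptions as vexc

    def wait_checked(session, task):
        # session: an oslo_vmware.api.VMwareAPISession; task: a Task moref.
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException as exc:
            # exc.fault_list carries the raw fault names, e.g.
            # ['InvalidArgument'], when no specific class matched.
            print("VIM faults: %s" % exc.fault_list)
            raise  # Nova's spawn path lets this propagate; the build fails
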
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1196.066570] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1196.067130] env[68492]: ERROR nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.067130] env[68492]: Faults: ['InvalidArgument'] [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] yield resources [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.driver.spawn(context, instance, image_meta, [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._fetch_image_if_missing(context, vi) [ 1196.067130] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] image_cache(vi, tmp_image_ds_loc) [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] vm_util.copy_virtual_disk( [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] session._wait_for_task(vmdk_copy_task) [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.wait_for_task(task_ref) [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return evt.wait() [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = hub.switch() [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1196.067593] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.greenlet.switch() [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.f(*self.args, **self.kw) [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exceptions.translate_fault(task_info.error) [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Faults: ['InvalidArgument'] [ 1196.068029] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1196.068029] env[68492]: INFO nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Terminating instance [ 1196.069387] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1196.073145] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1196.073804] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef 
tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1196.073993] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1196.074243] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f2e39f2-ec5e-4884-bdef-48fdf5377cc4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.081106] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f3dc72e-b333-42ed-988c-065c80687761 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.088068] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1196.088328] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6de04d08-9bf2-4080-832b-915458c74aa1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.091640] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1196.091817] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1196.092838] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-731d0be1-e1b3-44d6-9bce-562f177bcbaa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.098358] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for the task: (returnval){ [ 1196.098358] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5282a7b8-0ef2-9de4-2fe2-a972447bf9b9" [ 1196.098358] env[68492]: _type = "Task" [ 1196.098358] env[68492]: } to complete. 
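
The "Waiting for the task: (returnval){ ... } to complete" / "progress is N%" pairs in this log are oslo.vmware's task polling: invoke_api() returns a Task managed-object reference and wait_for_task() polls it (the _poll_task frames in the quoted source paths) until it succeeds or raises. A minimal sketch; connection details and the browser/spec objects are placeholders:

    from oslo_vmware import api

    # Connection values are placeholders; real ones come from nova.conf.
    session = api.VMwareAPISession(
        "vc.example.test", "user", "secret",
        api_retry_count=10, task_poll_interval=0.5)

    browser = None  # placeholder: HostDatastoreBrowser managed-object ref
    spec = None     # placeholder: HostDatastoreBrowserSearchSpec

    task = session.invoke_api(
        session.vim, "SearchDatastore_Task", browser,
        datastorePath="[datastore2] devstack-image-cache_base",
        searchSpec=spec)
    session.wait_for_task(task)  # blocks, polling until success or a fault
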
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1196.114187] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1196.114577] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Creating directory with path [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1196.114940] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-32d8c4a4-436d-4a3e-ac35-352fbe17e909 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.139200] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Created directory with path [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1196.139200] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Fetch image to [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1196.139200] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1196.139200] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5360f3f0-e9f1-4c49-ad62-8e5c5ad98f99 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.145927] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02910cf3-1ee8-4ba2-8f49-11bd245a6e57 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.156781] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30faa836-351c-4565-9a7e-8a8786d20bf2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.162028] env[68492]: DEBUG nova.virt.vmwareapi.vmops 
[None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1196.162406] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1196.162751] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Deleting the datastore file [datastore2] cbddbd81-2931-4d28-bd69-ef3f8f1e366c {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1196.163528] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-00033c86-da22-4b0c-a203-a45375c8c94e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.192091] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f190306d-c263-4916-9fe2-feecffb429c5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.195468] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for the task: (returnval){ [ 1196.195468] env[68492]: value = "task-3395449" [ 1196.195468] env[68492]: _type = "Task" [ 1196.195468] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1196.201119] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-31283437-a356-4075-917b-30e39adfbac4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.205528] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Task: {'id': task-3395449, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1196.225032] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1196.441963] env[68492]: DEBUG oslo_vmware.rw_handles [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1196.516936] env[68492]: DEBUG oslo_vmware.rw_handles [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1196.517159] env[68492]: DEBUG oslo_vmware.rw_handles [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1196.706102] env[68492]: DEBUG oslo_vmware.api [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Task: {'id': task-3395449, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084577} completed successfully. 
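
The rw_handles lines above show the datastore upload path: a write handle is opened against the ESX host's /folder endpoint (note the dcPath and dsName query parameters in the logged URL) and the Glance image data is streamed into tmp-sparse.vmdk. Roughly, with oslo.vmware; the host, cookies, and chunk iterator are placeholders, and the FileWriteHandle argument order is a sketch of rw_handles rather than a verified signature:

    from oslo_vmware import rw_handles

    cookies = []          # placeholder: the vim session's auth cookies
    file_size = 21318656  # size logged above for tmp-sparse.vmdk
    image_chunks = []     # placeholder: iterable of bytes read from Glance

    # Writes to https://<host>:443/folder/<path>?dcPath=...&dsName=datastore2
    handle = rw_handles.FileWriteHandle(
        "esx7c1n1.example.test", 443, "ha-datacenter", "datastore2",
        cookies, "vmware_temp/.../tmp-sparse.vmdk", file_size)
    for chunk in image_chunks:
        handle.write(chunk)
    handle.close()  # emits the 'Closing write handle for <URL>' debug line
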
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1196.706610] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1196.706610] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1196.706713] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1196.706874] env[68492]: INFO nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1196.710348] env[68492]: DEBUG nova.compute.claims [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1196.710519] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1196.710731] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.178689] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a74defed-ba21-42b8-84e7-21affb07fcb6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.188978] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-575cca48-0d7b-4b05-b0e8-9e77c3a0c123 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.229280] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed3cc3c4-1556-40e0-b9f4-c2648f3416ab {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.237628] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-381b599c-7a07-49fa-9e24-39382a7d1d2e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.251443] env[68492]: DEBUG nova.compute.provider_tree [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1197.261097] env[68492]: DEBUG nova.scheduler.client.report [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1197.280751] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.570s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.281344] env[68492]: ERROR nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1197.281344] env[68492]: Faults: ['InvalidArgument'] [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.driver.spawn(context, instance, image_meta, [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._fetch_image_if_missing(context, vi) 
[ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] image_cache(vi, tmp_image_ds_loc) [ 1197.281344] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] vm_util.copy_virtual_disk( [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] session._wait_for_task(vmdk_copy_task) [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.wait_for_task(task_ref) [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return evt.wait() [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = hub.switch() [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.greenlet.switch() [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1197.281755] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.f(*self.args, **self.kw) [ 1197.282233] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1197.282233] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exceptions.translate_fault(task_info.error) [ 1197.282233] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1197.282233] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Faults: ['InvalidArgument'] [ 1197.282233] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.282233] env[68492]: DEBUG nova.compute.utils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef 
tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1197.287445] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Build of instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c was re-scheduled: A specified parameter was not correct: fileType [ 1197.287445] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1197.287864] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1197.288053] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1197.288214] env[68492]: DEBUG nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1197.288367] env[68492]: DEBUG nova.network.neutron [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1197.558739] env[68492]: DEBUG neutronclient.v2_0.client [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1197.560932] env[68492]: ERROR nova.compute.manager [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.driver.spawn(context, instance, image_meta, [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._fetch_image_if_missing(context, vi) [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] image_cache(vi, tmp_image_ds_loc) [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1197.560932] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] vm_util.copy_virtual_disk( [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] session._wait_for_task(vmdk_copy_task) [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.wait_for_task(task_ref) [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return evt.wait() [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = hub.switch() [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.greenlet.switch() [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.f(*self.args, **self.kw) [ 1197.562025] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exceptions.translate_fault(task_info.error) [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Faults: ['InvalidArgument'] [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] During handling of the above exception, another exception occurred: [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._build_and_run_instance(context, instance, image, [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exception.RescheduledException( [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] nova.exception.RescheduledException: Build of instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c was re-scheduled: A specified parameter was not correct: fileType [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Faults: ['InvalidArgument'] [ 1197.562409] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] During handling of the above exception, another exception occurred: [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 
1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] exception_handler_v20(status_code, error_body) [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise client_exc(message=error_message, [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Neutron server returns request_ids: ['req-cfadfd7c-58b0-4545-95a4-616e93d2d9a1'] [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.562828] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] During handling of the above exception, another exception occurred: [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._deallocate_network(context, instance, requested_networks) [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.network_api.deallocate_for_instance( [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] data = neutron.list_ports(**search_opts) [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.list('ports', self.ports_path, retrieve_all, [ 1197.563197] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: 
cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] for r in self._pagination(collection, path, **params): [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] res = self.get(path, params=params) [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.retry_request("GET", action, body=body, [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1197.563540] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.do_request(method, action, body=body, [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._handle_fault_response(status_code, replybody, resp) [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exception.Unauthorized() [ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] nova.exception.Unauthorized: Not authorized. 
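The 401 traced above is the root of the cleanup failure: Neutron rejects the token that nova-compute presents when it calls list_ports() inside deallocate_for_instance(), so even tearing down the failed build cannot proceed. A later entry in this log points at the Neutron admin credentials in nova.conf; a minimal sketch of the relevant [neutron] section follows, where every value is an illustrative placeholder rather than anything read from this deployment:

    [neutron]
    auth_type = password
    # Keystone endpoint; this hostname is hypothetical
    auth_url = https://keystone.example.test:5000/v3
    username = nova-service-user
    password = SERVICE_PASSWORD
    project_name = service
    user_domain_name = Default
    project_domain_name = Default
    region_name = RegionOne

If these credentials are wrong, or the service token is invalidated mid-run as can happen in CI, every Neutron call from nova-compute fails with exactly this 401 — first during the build-failure cleanup and again when the instance is terminated, as the records below show.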
[ 1197.564041] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.645015] env[68492]: INFO nova.scheduler.client.report [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Deleted allocations for instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c [ 1197.672218] env[68492]: DEBUG oslo_concurrency.lockutils [None req-997773cb-7f4b-4a03-9bba-74ad6253faef tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 609.878s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.672844] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 409.334s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.673165] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Acquiring lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.673389] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.673551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.677917] env[68492]: INFO nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Terminating instance [ 1197.679832] env[68492]: DEBUG nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1197.679927] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1197.680168] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-79b9b95b-2caf-493e-a70b-c6680f2d346b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.689510] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8ba5f64-2015-463e-af86-8f1df7001229 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1197.702464] env[68492]: DEBUG nova.compute.manager [None req-f4edb98c-865a-44f7-8205-7703fad07800 tempest-ImagesOneServerNegativeTestJSON-1666722731 tempest-ImagesOneServerNegativeTestJSON-1666722731-project-member] [instance: 5c5946ea-9bda-4c9c-80cb-e8a580b74148] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1197.729045] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cbddbd81-2931-4d28-bd69-ef3f8f1e366c could not be found. [ 1197.729045] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1197.729374] env[68492]: INFO nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1197.729433] env[68492]: DEBUG oslo.service.loopingcall [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1197.729689] env[68492]: DEBUG nova.compute.manager [-] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1197.729784] env[68492]: DEBUG nova.network.neutron [-] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1197.734215] env[68492]: DEBUG nova.compute.manager [None req-f4edb98c-865a-44f7-8205-7703fad07800 tempest-ImagesOneServerNegativeTestJSON-1666722731 tempest-ImagesOneServerNegativeTestJSON-1666722731-project-member] [instance: 5c5946ea-9bda-4c9c-80cb-e8a580b74148] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1197.760213] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f4edb98c-865a-44f7-8205-7703fad07800 tempest-ImagesOneServerNegativeTestJSON-1666722731 tempest-ImagesOneServerNegativeTestJSON-1666722731-project-member] Lock "5c5946ea-9bda-4c9c-80cb-e8a580b74148" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.318s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.770211] env[68492]: DEBUG nova.compute.manager [None req-d9802bbb-f996-4d5b-916e-ae83961094c7 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: d5f69f3c-ef44-462e-817a-3258de5f5ff3] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1197.834380] env[68492]: DEBUG nova.compute.manager [None req-d9802bbb-f996-4d5b-916e-ae83961094c7 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: d5f69f3c-ef44-462e-817a-3258de5f5ff3] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1197.859390] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d9802bbb-f996-4d5b-916e-ae83961094c7 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "d5f69f3c-ef44-462e-817a-3258de5f5ff3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.206s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.872839] env[68492]: DEBUG nova.compute.manager [None req-28ac2215-fec8-4cf0-85aa-cec8c31ae2e8 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 37f2e678-b217-4bf3-83e6-74d85ee8a446] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1197.902678] env[68492]: DEBUG nova.compute.manager [None req-28ac2215-fec8-4cf0-85aa-cec8c31ae2e8 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 37f2e678-b217-4bf3-83e6-74d85ee8a446] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1197.905569] env[68492]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1197.905652] env[68492]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token; please verify the Neutron admin credentials located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-b7557e61-c78b-4665-911b-daf40ac5a505'] [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1197.906386] env[68492]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall 
self.network_api.deallocate_for_instance( [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1197.906924] env[68492]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1197.907597] env[68492]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
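The "Dynamic interval looping call ... RetryDecorator.__call__.._func' failed" record above is Nova's retry wrapper around network deallocation. oslo_service.loopingcall.RetryDecorator re-invokes the decorated function, sleeping longer between attempts, but only for the exception types it is told are transient; anything else escapes on the first attempt and the looping call logs it as failed, which is what happens here with NeutronAdminCredentialConfigurationInvalid. A self-contained sketch of that mechanism follows; the retry parameters and the TransientNeutronError type are invented for illustration and are not Nova's actual values:

    # Sketch of the retry wrapper seen in the traceback above.
    # Requires oslo.service; all names except the import are illustrative.
    from oslo_service import loopingcall

    class TransientNeutronError(Exception):
        """Stand-in for the exception types the wrapper treats as retriable."""

    attempts = []

    @loopingcall.RetryDecorator(
        max_retry_count=2,       # allow two retries after the first attempt
        inc_sleep_time=1,        # back-off grows by one second per failure
        max_sleep_time=2,
        exceptions=(TransientNeutronError,))
    def deallocate_ports():
        attempts.append(1)
        if len(attempts) < 3:
            # A listed exception is swallowed and the call is rescheduled.
            raise TransientNeutronError("simulated transient failure")
        return "deallocated"

    print(deallocate_ports())  # succeeds on the third attempt

An exception type not named in exceptions=... would instead propagate out of the first call, exactly as NeutronAdminCredentialConfigurationInvalid does in this log, so no amount of retrying masks the bad credentials.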
[ 1197.907597] env[68492]: ERROR oslo.service.loopingcall [ 1197.908120] env[68492]: ERROR nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1197.948413] env[68492]: ERROR nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] exception_handler_v20(status_code, error_body) [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise client_exc(message=error_message, [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1197.948413] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Neutron server returns request_ids: ['req-b7557e61-c78b-4665-911b-daf40ac5a505'] [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] During handling of the above exception, another exception occurred: [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Traceback (most recent call last): [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._delete_instance(context, instance, bdms) [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: 
cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._shutdown_instance(context, instance, bdms) [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._try_deallocate_network(context, instance, requested_networks) [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] with excutils.save_and_reraise_exception(): [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1197.948756] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.force_reraise() [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise self.value [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] _deallocate_network_with_retries() [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return evt.wait() [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = hub.switch() [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.greenlet.switch() [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1197.949177] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = func(*self.args, **self.kw) [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1197.949607] env[68492]: ERROR nova.compute.manager 
[instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] result = f(*args, **kwargs) [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._deallocate_network( [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self.network_api.deallocate_for_instance( [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] data = neutron.list_ports(**search_opts) [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.list('ports', self.ports_path, retrieve_all, [ 1197.949607] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] for r in self._pagination(collection, path, **params): [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] res = self.get(path, params=params) [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.retry_request("GET", action, body=body, [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1197.949992] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] return self.do_request(method, action, body=body, [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] ret = obj(*args, **kwargs) [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] self._handle_fault_response(status_code, replybody, resp) [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1197.950382] env[68492]: ERROR nova.compute.manager [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] [ 1197.952568] env[68492]: DEBUG oslo_concurrency.lockutils [None req-28ac2215-fec8-4cf0-85aa-cec8c31ae2e8 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "37f2e678-b217-4bf3-83e6-74d85ee8a446" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.385s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.967682] env[68492]: DEBUG nova.compute.manager [None req-8f99461d-48bc-4adc-b558-823ed4a0b541 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 888dac8e-013f-4024-9fa7-4cc13c361268] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1197.992461] env[68492]: DEBUG oslo_concurrency.lockutils [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.320s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.993600] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 37.749s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.993795] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] During sync_power_state the instance has a pending task (deleting). Skip. [ 1197.993968] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "cbddbd81-2931-4d28-bd69-ef3f8f1e366c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1198.000031] env[68492]: DEBUG nova.compute.manager [None req-8f99461d-48bc-4adc-b558-823ed4a0b541 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 888dac8e-013f-4024-9fa7-4cc13c361268] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1198.024175] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8f99461d-48bc-4adc-b558-823ed4a0b541 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "888dac8e-013f-4024-9fa7-4cc13c361268" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.188s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1198.037845] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1198.072985] env[68492]: INFO nova.compute.manager [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] [instance: cbddbd81-2931-4d28-bd69-ef3f8f1e366c] Successfully reverted task state from None on failure for instance. [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server [None req-42a97951-428f-42f7-ba15-7dbd68fa8a09 tempest-ServerDiagnosticsNegativeTest-1726901203 tempest-ServerDiagnosticsNegativeTest-1726901203-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
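Annotation: the "Exception during message handling" record above is the client wrapper in nova/network/neutron.py at work (frames 196 and 212 of the traceback that follows): every neutronclient call is proxied, and a 401 Unauthorized from Neutron is re-raised as the Nova-level NeutronAdminCredentialConfigurationInvalid. A minimal sketch of that translation pattern, with simplified names; the real wrapper also refreshes admin tokens and handles other exception classes:

    import functools

    from neutronclient.common import exceptions as neutron_exc


    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the nova.exception class of the same name."""


    def translate_unauthorized(func):
        # Illustrative equivalent of the wrapper at nova/network/neutron.py:196:
        # run the wrapped neutronclient call and convert a 401 into a
        # configuration error, as seen at neutron.py:212 in the trace.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_exc.Unauthorized:
                raise NeutronAdminCredentialConfigurationInvalid()
        return wrapper


    @translate_unauthorized
    def list_ports(client, **search_opts):
        return client.list_ports(**search_opts)

Because the delete path calls list_ports while deallocating networks, an expired or misconfigured Neutron service credential makes every terminate_instance fail this way until the credential is corrected.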
[ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-b7557e61-c78b-4665-911b-daf40ac5a505'] [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1198.080270] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1198.080852] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1198.081454] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1198.082016] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1198.082571] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.083170] env[68492]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1198.083170] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
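Annotation: most of the intermediate frames above are oslo_utils.excutils.save_and_reraise_exception, the context manager Nova uses to run cleanup (reverting task state, emitting notifications) without losing the original exception; that is why the "Successfully reverted task state" INFO record appears even though the error still propagates to the RPC server. A minimal usage sketch, with the cleanup body invented for illustration:

    from oslo_utils import excutils

    def shutdown_with_cleanup(do_shutdown, revert_task_state):
        try:
            do_shutdown()
        except Exception:
            # On exiting this block the saved exception is re-raised
            # automatically (unless reraise is set to False inside).
            with excutils.save_and_reraise_exception():
                revert_task_state()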
[ 1198.085208] env[68492]: ERROR oslo_messaging.rpc.server [ 1198.094414] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1198.094665] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1198.096213] env[68492]: INFO nova.compute.claims [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1198.503047] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f396a75-8c01-4edd-b424-06ef12dcad17 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.509256] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c65ae953-1b26-41eb-98e1-02320d7d54b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.540565] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b6cced5-856c-4f1e-ac39-e4de7380839d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.547987] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d057ab60-4723-458a-aa66-69388bbad7f9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.561603] env[68492]: DEBUG nova.compute.provider_tree [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1198.573320] env[68492]: DEBUG nova.scheduler.client.report [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1198.592434] env[68492]: 
DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.497s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1198.592824] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1198.642899] env[68492]: DEBUG nova.compute.utils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1198.648021] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1198.648021] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1198.660619] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1198.717118] env[68492]: DEBUG nova.policy [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6f89d186e7a4b418bd0240fbf6c6e27', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c964bca1dcd9433781b74f468e174a0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1198.739042] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1198.771181] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1198.771693] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1198.771693] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1198.772128] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1198.772128] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1198.772128] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1198.772303] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1198.772489] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1198.772657] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1198.773068] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1198.773068] env[68492]: DEBUG nova.virt.hardware [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1198.773884] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51a615a9-144a-47a4-b104-f81cc898ac08 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.787187] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-202de706-918c-40a1-998f-6d253252f886 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.002410] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.002512] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.096489] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Successfully created port: 44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1199.976547] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Successfully updated port: 44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1199.989232] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 
tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1199.989393] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquired lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1199.989546] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1200.036626] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1200.070290] env[68492]: DEBUG nova.compute.manager [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Received event network-vif-plugged-44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1200.070516] env[68492]: DEBUG oslo_concurrency.lockutils [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service nova] Acquiring lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1200.071075] env[68492]: DEBUG oslo_concurrency.lockutils [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service nova] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.071262] env[68492]: DEBUG oslo_concurrency.lockutils [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service nova] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.074321] env[68492]: DEBUG nova.compute.manager [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] No waiting events found dispatching network-vif-plugged-44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1200.074534] env[68492]: WARNING nova.compute.manager [req-3c3bccd8-eaae-4cb7-b2aa-e7cc844c747e req-67b78764-877a-4cc5-8613-4bb24b642201 service 
nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Received unexpected event network-vif-plugged-44d33998-ed97-49bc-8b10-5cf58b57627a for instance with vm_state building and task_state spawning. [ 1200.211826] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Updating instance_info_cache with network_info: [{"id": "44d33998-ed97-49bc-8b10-5cf58b57627a", "address": "fa:16:3e:4d:29:90", "network": {"id": "e38da76f-2a4d-4999-abd7-b66566ffe9a5", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1556692174-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "c964bca1dcd9433781b74f468e174a0b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "748a5204-8f14-402c-9a6e-f3e6104db082", "external-id": "nsx-vlan-transportzone-750", "segmentation_id": 750, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44d33998-ed", "ovs_interfaceid": "44d33998-ed97-49bc-8b10-5cf58b57627a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1200.226198] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Releasing lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1200.226520] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance network_info: |[{"id": "44d33998-ed97-49bc-8b10-5cf58b57627a", "address": "fa:16:3e:4d:29:90", "network": {"id": "e38da76f-2a4d-4999-abd7-b66566ffe9a5", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1556692174-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "c964bca1dcd9433781b74f468e174a0b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "748a5204-8f14-402c-9a6e-f3e6104db082", "external-id": "nsx-vlan-transportzone-750", "segmentation_id": 750, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44d33998-ed", "ovs_interfaceid": "44d33998-ed97-49bc-8b10-5cf58b57627a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1200.226916] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4d:29:90', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '748a5204-8f14-402c-9a6e-f3e6104db082', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '44d33998-ed97-49bc-8b10-5cf58b57627a', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1200.235959] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Creating folder: Project (c964bca1dcd9433781b74f468e174a0b). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1200.236721] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c4c1bc2a-7f56-492b-954c-20ff8275a35a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.252164] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Created folder: Project (c964bca1dcd9433781b74f468e174a0b) in parent group-v677434. [ 1200.252164] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Creating folder: Instances. Parent ref: group-v677506. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1200.252164] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3909393d-d142-48af-8b15-d60592568486 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.258464] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Created folder: Instances in parent group-v677506. [ 1200.258722] env[68492]: DEBUG oslo.service.loopingcall [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1200.258902] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1200.259113] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e040cd50-93b6-4f45-8d79-f31b1a7e167f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.277640] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1200.277640] env[68492]: value = "task-3395452" [ 1200.277640] env[68492]: _type = "Task" [ 1200.277640] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1200.285249] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395452, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1200.787162] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395452, 'name': CreateVM_Task, 'duration_secs': 0.297129} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1200.787327] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1200.794152] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1200.794328] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1200.794645] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1200.794884] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb5adc1e-8730-4809-8b95-3c1097617615 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.799100] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for the task: (returnval){ [ 1200.799100] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5270efcb-cfda-e7b6-cca5-d93b3b088eb5" [ 1200.799100] env[68492]: _type = "Task" [ 
1200.799100] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1200.806966] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5270efcb-cfda-e7b6-cca5-d93b3b088eb5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1201.309818] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1201.310232] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1201.310433] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1202.095831] env[68492]: DEBUG nova.compute.manager [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Received event network-changed-44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1202.096187] env[68492]: DEBUG nova.compute.manager [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Refreshing instance network info cache due to event network-changed-44d33998-ed97-49bc-8b10-5cf58b57627a. 
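Annotation: the Acquiring/Acquired/Releasing triples around the [datastore2] devstack-image-cache_base path are oslo.concurrency's lockutils guarding the image cache; each phase is logged with the waited/held durations seen throughout this log. A minimal sketch of the same primitive, with the lock body invented:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('devstack-image-cache_base/IMAGE_ID')
    def fetch_image_if_missing():
        # Runs with the named lock held; lockutils emits the
        # acquired/released DEBUG records with wait and hold times.
        pass

    # The same lock is also available as a context manager:
    # with lockutils.lock('devstack-image-cache_base/IMAGE_ID'):
    #     ...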
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1202.096535] env[68492]: DEBUG oslo_concurrency.lockutils [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] Acquiring lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1202.096810] env[68492]: DEBUG oslo_concurrency.lockutils [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] Acquired lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1202.097118] env[68492]: DEBUG nova.network.neutron [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Refreshing network info cache for port 44d33998-ed97-49bc-8b10-5cf58b57627a {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1202.337748] env[68492]: DEBUG nova.network.neutron [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Updated VIF entry in instance network info cache for port 44d33998-ed97-49bc-8b10-5cf58b57627a. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1202.338100] env[68492]: DEBUG nova.network.neutron [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Updating instance_info_cache with network_info: [{"id": "44d33998-ed97-49bc-8b10-5cf58b57627a", "address": "fa:16:3e:4d:29:90", "network": {"id": "e38da76f-2a4d-4999-abd7-b66566ffe9a5", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1556692174-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "c964bca1dcd9433781b74f468e174a0b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "748a5204-8f14-402c-9a6e-f3e6104db082", "external-id": "nsx-vlan-transportzone-750", "segmentation_id": 750, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44d33998-ed", "ovs_interfaceid": "44d33998-ed97-49bc-8b10-5cf58b57627a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1202.349141] env[68492]: DEBUG oslo_concurrency.lockutils [req-ae642cf8-f1a2-4315-96f9-d66cb1a298e6 req-31abaaff-5cef-4c21-91e3-eb991f2d17ee service nova] Releasing lock "refresh_cache-685c54e1-5251-4ea2-a4bb-fcdafe9d270c" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1204.314445] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock 
"685c54e1-5251-4ea2-a4bb-fcdafe9d270c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1214.967416] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1b4f5886-26eb-4e75-ba9b-0eab140140cf tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Acquiring lock "74853d33-dc81-497b-9af3-72973e20e60b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1214.967416] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1b4f5886-26eb-4e75-ba9b-0eab140140cf tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Lock "74853d33-dc81-497b-9af3-72973e20e60b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1229.777025] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6aba1fe3-3953-4b08-abd8-cda96828956d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Acquiring lock "f5dde0b2-1403-466c-aa23-a5573915256d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1229.777617] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6aba1fe3-3953-4b08-abd8-cda96828956d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "f5dde0b2-1403-466c-aa23-a5573915256d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.226966] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1237.252111] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1239.232060] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1239.232480] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1239.232480] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1239.258266] env[68492]: DEBUG nova.compute.manager [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.258431] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.258567] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.258699] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.258815] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.258933] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.259076] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.259198] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.259313] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.259430] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1239.259548] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
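Annotation: the _heal_instance_info_cache, _poll_rebooting_instances, and update_available_resource records here are oslo.service periodic tasks: compute-manager methods decorated with a spacing and driven by run_periodic_tasks, which accounts for the regular cadence of these DEBUG lines. A compressed sketch; the spacing value is illustrative:

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            # Rebuild the list of instances to heal, skipping any that are
            # still Building, then refresh one network info cache entry.
            pass

    mgr = Manager(cfg.CONF)
    mgr.run_periodic_tasks(context=None)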
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1239.260101] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1240.231091] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1240.244355] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1240.244671] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1240.244744] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1240.244879] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1240.246020] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf87ba1-f436-47c5-a748-b7ae8abafbb2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.256431] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b6d532-cf15-4c17-97da-846c1f66068f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.277683] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-490db37f-3274-4382-8e0b-c884d62b9365 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.284683] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7cfcac2-5943-467f-a70a-70f2eef87208 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.316096] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180916MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1240.316485] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1240.316485] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1240.402787] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.402842] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.402972] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403147] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403221] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403366] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403438] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403931] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.403982] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.404165] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1240.420858] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.431698] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.442719] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 49885647-f6a0-468a-bf58-206de779c896 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.453689] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.463730] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5bec90ae-12e8-4620-ac96-76d82e123f7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.475778] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.489526] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9694688e-b937-4999-9b25-3caea82695b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.502555] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 51e8e546-2bd7-495b-a81d-a6cdc4dba99c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.513961] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 6a630f7b-3c45-42b2-b8ab-e93490cc1eb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.524399] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.536510] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 74853d33-dc81-497b-9af3-72973e20e60b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.550790] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f5dde0b2-1403-466c-aa23-a5573915256d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1240.551045] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1240.551204] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1240.833556] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c31563a-ced7-4e1b-8a6d-9f41965a8e6c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.841461] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c5efde9-79dc-43b9-8597-62789dceabd3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.872710] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d9a492-2e00-4fe9-8559-9f0438bd5991 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.879791] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d24e7d2-4869-4b08-a0f0-d28ed7392b29 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1240.892551] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1240.901477] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1240.916088] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1240.916238] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1242.897893] env[68492]: WARNING oslo_vmware.rw_handles [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.897893] env[68492]: ERROR oslo_vmware.rw_handles [ 1242.897893] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1242.899602] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1242.899862] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Copying Virtual Disk [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/cf7ab69a-5c78-471a-845d-eeb0439cac11/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1242.903499] env[68492]: DEBUG oslo_vmware.service [-] 
Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-290d9573-f796-4a38-a275-58ad11df8cc9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.911318] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for the task: (returnval){ [ 1242.911318] env[68492]: value = "task-3395453" [ 1242.911318] env[68492]: _type = "Task" [ 1242.911318] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1242.919954] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Task: {'id': task-3395453, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.423700] env[68492]: DEBUG oslo_vmware.exceptions [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1243.424172] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1243.430123] env[68492]: ERROR nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.430123] env[68492]: Faults: ['InvalidArgument'] [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Traceback (most recent call last): [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] yield resources [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self.driver.spawn(context, instance, image_meta, [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1243.430123] env[68492]: ERROR nova.compute.manager 
[instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self._fetch_image_if_missing(context, vi) [ 1243.430123] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] image_cache(vi, tmp_image_ds_loc) [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] vm_util.copy_virtual_disk( [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] session._wait_for_task(vmdk_copy_task) [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return self.wait_for_task(task_ref) [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return evt.wait() [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] result = hub.switch() [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1243.431075] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return self.greenlet.switch() [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self.f(*self.args, **self.kw) [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] raise exceptions.translate_fault(task_info.error) [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: 
fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Faults: ['InvalidArgument'] [ 1243.431458] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] [ 1243.431458] env[68492]: INFO nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Terminating instance [ 1243.432923] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1243.432923] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1243.433129] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2600227f-46ee-4209-8d8c-25da5feccdda {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.436487] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1243.436661] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1243.437795] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca08fb9-5200-45b1-9652-8ae12eb6f673 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.446766] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1243.450183] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ba87757a-fc06-40a8-86e0-941841d53084 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.450183] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1243.450183] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1243.450742] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-35727abd-6440-4d77-856e-d2879f147c40 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.456651] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Waiting for the task: (returnval){ [ 1243.456651] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5295d071-7eb4-a92c-df10-e0844c06984b" [ 1243.456651] env[68492]: _type = "Task" [ 1243.456651] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1243.464715] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5295d071-7eb4-a92c-df10-e0844c06984b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.515190] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1243.515494] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1243.515690] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Deleting the datastore file [datastore2] fcf9c3f0-4f46-4069-887f-fd666e6b3c53 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1243.515967] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26270728-7ad4-4f95-beb9-f1963a60a76c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.522223] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for the task: (returnval){ [ 1243.522223] env[68492]: value = "task-3395455" [ 1243.522223] env[68492]: _type = "Task" [ 1243.522223] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1243.529932] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Task: {'id': task-3395455, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.911917] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.912224] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1243.968635] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1243.968903] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Creating directory with path [datastore2] vmware_temp/c1013bd6-0956-4461-99b8-06c92b6765fc/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1243.969152] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-79b926a4-40c8-4c5c-81dd-1db6eea8459c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.998271] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Created directory with path [datastore2] vmware_temp/c1013bd6-0956-4461-99b8-06c92b6765fc/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1243.998553] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Fetch image to [datastore2] vmware_temp/c1013bd6-0956-4461-99b8-06c92b6765fc/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1243.998658] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/c1013bd6-0956-4461-99b8-06c92b6765fc/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1243.999401] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e506c96-ec73-4e79-8b0a-e0d187d06343 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.007411] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6f0c6900-f674-4b6c-a6cd-5ea19fb1cce6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.018283] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c99fd14-d311-4ca6-a014-b83ec6efbfd7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.033586] env[68492]: DEBUG oslo_vmware.api [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Task: {'id': task-3395455, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076804} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1244.058091] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1244.058868] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1244.058868] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1244.058868] env[68492]: INFO nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Took 0.62 seconds to destroy the instance on the hypervisor. 
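The "Waiting for the task: (returnval){...} to complete", "progress is 0%.", and "completed successfully" records above come from oslo.vmware's task-polling helper, which blocks the caller until vCenter reports the task finished. The following is a minimal, self-contained sketch of that polling pattern, not the library's actual implementation: it assumes a caller-supplied get_task_info callable, and poll_interval and the dict shape are illustrative only (the real helper drives the vSphere API through a looping call and translates faults before raising).

import time

class TaskFailed(Exception):
    """Stand-in for oslo.vmware's translated task fault."""

def wait_for_task(get_task_info, task, poll_interval=0.5):
    # Poll until the task leaves the 'running' state, mirroring the
    # _poll_task DEBUG records seen in the log above.
    while True:
        info = get_task_info(task)
        if info["state"] == "running":
            print(f"Task: {task} progress is {info.get('progress', 0)}%.")
        elif info["state"] == "success":
            print(f"Task: {task} completed successfully.")
            return info
        else:
            # On 'error', the real helper first maps the VIM fault to an
            # exception class (cf. the "Fault ... not matched" record).
            raise TaskFailed(info.get("error", "unknown fault"))
        time.sleep(poll_interval)

# Toy run: a DeleteDatastoreFile-style task that succeeds on the third poll.
states = iter([{"state": "running", "progress": 0},
               {"state": "running", "progress": 50},
               {"state": "success"}])
wait_for_task(lambda task: next(states), "task-3395455")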
[ 1244.061177] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-193d678b-07e2-47cf-8838-ff20a88c61cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.063978] env[68492]: DEBUG nova.compute.claims [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1244.064170] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.064388] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.071107] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0aa5c792-14e4-4073-9c5b-30b51ceed44c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.095332] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1244.230474] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1244.298339] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1244.298339] env[68492]: ERROR nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. 
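The ImageNotAuthorized spawn failure recorded here is produced by the translate-and-reraise idiom in nova/image/glance.py, visible in the chained traceback that follows: the glanceclient HTTPUnauthorized is caught, mapped to a Nova exception, and re-raised with the original traceback via new_exc.with_traceback(exc_trace), which is why the log shows two tracebacks joined by "During handling of the above exception, another exception occurred". A self-contained sketch of the idiom (the exception classes below are stand-ins, not the real nova or glanceclient types):

import sys

class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized."""

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""

def _reraise_translated(image_id):
    # Map the client error to the service's own exception type while
    # keeping the original traceback, mirroring what
    # _reraise_translated_image_exception does in glance.py.
    exc_type, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, HTTPUnauthorized):
        new_exc = ImageNotAuthorized(f"Not authorized for image {image_id}.")
    else:
        new_exc = exc_value
    raise new_exc.with_traceback(exc_trace)

def show(image_id):
    try:
        raise HTTPUnauthorized("HTTP 401 Unauthorized")  # the client call fails
    except Exception:
        _reraise_translated(image_id)

try:
    show("595bda25-3485-4d7e-9f66-50f61186cadc")
except ImageNotAuthorized as exc:
    print(exc)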
[ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1244.298339] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1244.298700] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] yield resources [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.driver.spawn(context, instance, image_meta, [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._fetch_image_if_missing(context, vi) [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1244.299142] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image_fetch(context, vi, tmp_image_ds_loc) [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] images.fetch_image( [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] metadata = IMAGE_API.get(context, image_ref) [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return session.show(context, image_id, [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] _reraise_translated_image_exception(image_id) [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise new_exc.with_traceback(exc_trace) [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1244.299541] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1244.299910] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1244.300249] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1244.300249] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.ImageNotAuthorized: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. 
[ 1244.300249] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1244.301944] env[68492]: INFO nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Terminating instance [ 1244.303243] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1244.303494] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1244.306256] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dcf1007e-07da-4ad4-ad85-c008b16e4feb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.309215] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1244.309215] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1244.309803] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25312b07-d950-4061-ac63-54239dd41ef8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.317589] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1244.318650] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2b7e002f-d31c-40c2-a4f1-dcbd3a23fced {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.320196] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1244.320353] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 
tempest-SecurityGroupsTestJSON-1867999903-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1244.323170] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb226a2d-b3bf-4618-94d0-6e7eae420fbd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.329421] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1244.329421] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52bf70ad-4b2a-7ea0-4371-151f3375f6f6" [ 1244.329421] env[68492]: _type = "Task" [ 1244.329421] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1244.337581] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52bf70ad-4b2a-7ea0-4371-151f3375f6f6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1244.381891] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1244.382137] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1244.382317] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Deleting the datastore file [datastore2] 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1244.382627] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fda5025e-bcd8-4933-a91b-16644b3a03f7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.393283] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Waiting for the task: (returnval){ [ 1244.393283] env[68492]: value = "task-3395457" [ 1244.393283] env[68492]: _type = "Task" [ 1244.393283] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1244.401694] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Task: {'id': task-3395457, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1244.527247] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b72affc6-b953-44cf-acb6-a6ca82cd2414 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.534882] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b870ce8a-8119-4292-bf13-d8558483b952 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.567450] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86267e75-7ec2-49fe-a293-fcd86fac7f94 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.574854] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15888a64-4373-471f-95c6-b11633a4a130 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.588075] env[68492]: DEBUG nova.compute.provider_tree [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1244.598622] env[68492]: DEBUG nova.scheduler.client.report [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1244.611804] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.547s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1244.612343] env[68492]: ERROR nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 1244.612343] env[68492]: Faults: ['InvalidArgument'] [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Traceback (most recent call last): [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self.driver.spawn(context, instance, image_meta, [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self._fetch_image_if_missing(context, vi) [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] image_cache(vi, tmp_image_ds_loc) [ 1244.612343] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] vm_util.copy_virtual_disk( [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] session._wait_for_task(vmdk_copy_task) [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return self.wait_for_task(task_ref) [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return evt.wait() [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] result = hub.switch() [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] return self.greenlet.switch() [ 1244.612791] 
env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1244.612791] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] self.f(*self.args, **self.kw) [ 1244.613219] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1244.613219] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] raise exceptions.translate_fault(task_info.error) [ 1244.613219] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1244.613219] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Faults: ['InvalidArgument'] [ 1244.613219] env[68492]: ERROR nova.compute.manager [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] [ 1244.613219] env[68492]: DEBUG nova.compute.utils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1244.614449] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Build of instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 was re-scheduled: A specified parameter was not correct: fileType [ 1244.614449] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1244.614875] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1244.615073] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1244.615247] env[68492]: DEBUG nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1244.615472] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1244.839621] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1244.839898] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating directory with path [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1244.840164] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ae60618-6a00-4779-a319-107f8cdf0084 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.851344] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created directory with path [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1244.851540] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Fetch image to [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1244.851711] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1244.852545] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e27f1e7-8806-4ae7-8821-4da056218e55 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.859679] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-607afda1-7571-4462-b38f-0b85279b0e5d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.869307] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b643ca44-da69-4da9-8761-9b993a9e66f5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.904789] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947ad219-539b-4f42-9258-e43f9e3e6979 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.911796] env[68492]: DEBUG oslo_vmware.api [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Task: {'id': task-3395457, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.106078} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1244.913302] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1244.913564] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1244.913710] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1244.913952] env[68492]: INFO nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Took 0.60 seconds to destroy the instance on the hypervisor. 
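The Task lines above (progress is 0%, then "completed successfully" with a duration_secs field) come from oslo.vmware polling the vCenter task object until it reaches a terminal state. A hedged sketch of that polling pattern, assuming a dict-shaped task info for simplicity; this is illustrative, not oslo.vmware's actual implementation.

    import time

    def poll_task(get_task_info, interval=0.5, timeout=60.0):
        # get_task_info is a stand-in for the PropertyCollector read that
        # fetches the task's current state on each iteration.
        start = time.monotonic()
        while True:
            info = get_task_info()
            if info["state"] == "success":
                info["duration_secs"] = time.monotonic() - start
                return info
            if info["state"] == "error":
                # oslo.vmware raises a translated fault at this point, which
                # is how the VimFaultException/InvalidArgument above surfaced.
                raise RuntimeError(info.get("error", "task failed"))
            if time.monotonic() - start > timeout:
                raise TimeoutError("task did not complete in time")
            time.sleep(interval)

    # Example: a task that reports running once, then success.
    states = iter([{"state": "running"}, {"state": "success"}])
    print(poll_task(lambda: next(states), interval=0.01))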
[ 1244.915778] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-50293b16-cb16-4a21-bc76-3f39eec2cc91 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.918666] env[68492]: DEBUG nova.compute.claims [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1244.918839] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.919063] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.941401] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1245.013463] env[68492]: DEBUG oslo_vmware.rw_handles [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1245.077704] env[68492]: DEBUG oslo_vmware.rw_handles [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1245.078029] env[68492]: DEBUG oslo_vmware.rw_handles [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1245.124942] env[68492]: DEBUG nova.network.neutron [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1245.141771] env[68492]: INFO nova.compute.manager [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Took 0.53 seconds to deallocate network for instance. [ 1245.269739] env[68492]: INFO nova.scheduler.client.report [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Deleted allocations for instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 [ 1245.299551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-719b3f54-bc2b-4048-9738-4929ffbb7c95 tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 654.781s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.300537] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 456.396s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.300757] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Acquiring lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1245.300960] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.301148] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.303107] env[68492]: INFO nova.compute.manager [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Terminating instance [ 1245.305039] env[68492]: DEBUG nova.compute.manager [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1245.305231] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1245.305735] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b306e4eb-1ee0-476d-a42b-9313e683e50d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.317216] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afec41bd-0f47-44da-aa39-595eaae53cb4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.328947] env[68492]: DEBUG nova.compute.manager [None req-8c32eff8-b21b-4192-ba43-0a04f901898a tempest-ServerShowV254Test-391686084 tempest-ServerShowV254Test-391686084-project-member] [instance: a6bf3888-5c1a-4a12-85a9-221cbba6457b] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.352558] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fcf9c3f0-4f46-4069-887f-fd666e6b3c53 could not be found. [ 1245.355083] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1245.355083] env[68492]: INFO nova.compute.manager [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1245.355083] env[68492]: DEBUG oslo.service.loopingcall [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1245.355083] env[68492]: DEBUG nova.compute.manager [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1245.355083] env[68492]: DEBUG nova.network.neutron [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1245.362571] env[68492]: DEBUG nova.compute.manager [None req-8c32eff8-b21b-4192-ba43-0a04f901898a tempest-ServerShowV254Test-391686084 tempest-ServerShowV254Test-391686084-project-member] [instance: a6bf3888-5c1a-4a12-85a9-221cbba6457b] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.385143] env[68492]: DEBUG nova.network.neutron [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1245.397272] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8c32eff8-b21b-4192-ba43-0a04f901898a tempest-ServerShowV254Test-391686084 tempest-ServerShowV254Test-391686084-project-member] Lock "a6bf3888-5c1a-4a12-85a9-221cbba6457b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.681s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.397893] env[68492]: INFO nova.compute.manager [-] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] Took 0.04 seconds to deallocate network for instance. [ 1245.413310] env[68492]: DEBUG nova.compute.manager [None req-b4483b89-80dc-48ce-8ff6-d66c4bfdd20a tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] [instance: 2785a54b-6fd5-413d-bdd1-ead082d8777b] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.447653] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4344d3f5-4970-448b-adf1-227b9e483d45 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.451711] env[68492]: DEBUG nova.compute.manager [None req-b4483b89-80dc-48ce-8ff6-d66c4bfdd20a tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] [instance: 2785a54b-6fd5-413d-bdd1-ead082d8777b] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.458448] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db65928-6b42-42fd-974b-24f013bbd4be {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.500372] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6416eeaa-2319-4681-9b3d-b5908b403f46 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.512395] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-507828bd-ccb8-4d55-9307-3209c0a8cdaf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.515849] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b4483b89-80dc-48ce-8ff6-d66c4bfdd20a tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Lock "2785a54b-6fd5-413d-bdd1-ead082d8777b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.864s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.526906] env[68492]: DEBUG nova.compute.provider_tree [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1245.528940] env[68492]: DEBUG nova.compute.manager [None req-410dc1bf-9835-4db5-8451-2b7d653584bd tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: 2d422f7c-9295-4b08-a623-ae07bacb3e9d] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.542576] env[68492]: DEBUG nova.scheduler.client.report [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1245.560395] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.640s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.560395] env[68492]: ERROR nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1245.560395] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1245.560738] env[68492]: ERROR 
nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1245.560738] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.driver.spawn(context, instance, image_meta, [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._fetch_image_if_missing(context, vi) [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image_fetch(context, vi, tmp_image_ds_loc) [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1245.561065] env[68492]: ERROR nova.compute.manager [instance: 
93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] images.fetch_image( [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] metadata = IMAGE_API.get(context, image_ref) [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return session.show(context, image_id, [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] _reraise_translated_image_exception(image_id) [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise new_exc.with_traceback(exc_trace) [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1245.561518] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1245.561924] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.ImageNotAuthorized: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. [ 1245.562447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.562447] env[68492]: DEBUG nova.compute.utils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1245.562744] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Build of instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f was re-scheduled: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1245.563238] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1245.563381] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1245.563531] env[68492]: DEBUG nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1245.563698] env[68492]: DEBUG nova.network.neutron [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1245.567827] env[68492]: DEBUG nova.compute.manager [None req-410dc1bf-9835-4db5-8451-2b7d653584bd tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] [instance: 2d422f7c-9295-4b08-a623-ae07bacb3e9d] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.571535] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eab3cd8a-45f3-4dc9-86fd-f6300e5d76be tempest-ServersWithSpecificFlavorTestJSON-986118985 tempest-ServersWithSpecificFlavorTestJSON-986118985-project-member] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.271s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.572658] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 85.328s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.572862] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fcf9c3f0-4f46-4069-887f-fd666e6b3c53] During sync_power_state the instance has a pending task (deleting). Skip. [ 1245.573050] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "fcf9c3f0-4f46-4069-887f-fd666e6b3c53" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.598390] env[68492]: DEBUG oslo_concurrency.lockutils [None req-410dc1bf-9835-4db5-8451-2b7d653584bd tempest-AttachInterfacesTestJSON-1420994283 tempest-AttachInterfacesTestJSON-1420994283-project-member] Lock "2d422f7c-9295-4b08-a623-ae07bacb3e9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.729s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.615193] env[68492]: DEBUG nova.compute.manager [None req-bc1832ca-ef49-4005-9506-ca15c7b0e976 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: 61d932c3-4c41-4648-b5ee-c083ed425e1c] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.639748] env[68492]: DEBUG nova.compute.manager [None req-bc1832ca-ef49-4005-9506-ca15c7b0e976 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: 61d932c3-4c41-4648-b5ee-c083ed425e1c] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.663425] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bc1832ca-ef49-4005-9506-ca15c7b0e976 tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "61d932c3-4c41-4648-b5ee-c083ed425e1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.281s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.675846] env[68492]: DEBUG nova.compute.manager [None req-109c6c0f-9e3b-4501-99b3-c0860c4ee4a4 tempest-InstanceActionsTestJSON-1991500879 tempest-InstanceActionsTestJSON-1991500879-project-member] [instance: c9618d2a-72ce-4395-b739-2585861bc446] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.701716] env[68492]: DEBUG nova.compute.manager [None req-109c6c0f-9e3b-4501-99b3-c0860c4ee4a4 tempest-InstanceActionsTestJSON-1991500879 tempest-InstanceActionsTestJSON-1991500879-project-member] [instance: c9618d2a-72ce-4395-b739-2585861bc446] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.703281] env[68492]: DEBUG neutronclient.v2_0.client [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1245.704485] env[68492]: ERROR nova.compute.manager [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1245.704485] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.704826] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.driver.spawn(context, instance, image_meta, [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._fetch_image_if_missing(context, vi) [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image_fetch(context, vi, tmp_image_ds_loc) [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] images.fetch_image( [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] metadata = IMAGE_API.get(context, image_ref) [ 1245.705222] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return session.show(context, image_id, [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] _reraise_translated_image_exception(image_id) [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise new_exc.with_traceback(exc_trace) [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 
93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = getattr(controller, method)(*args, **kwargs) [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._get(image_id) [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1245.705602] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] resp, body = self.http_client.get(url, headers=header) [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.request(url, 'GET', **kwargs) [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self._handle_response(resp) [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exc.from_response(resp, resp.content) [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.ImageNotAuthorized: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. 
[ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.705965] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._build_and_run_instance(context, instance, image, [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exception.RescheduledException( [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.RescheduledException: Build of instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f was re-scheduled: Not authorized for image 595bda25-3485-4d7e-9f66-50f61186cadc. [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1245.706436] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] exception_handler_v20(status_code, error_body) [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise client_exc(message=error_message, [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Neutron server returns request_ids: ['req-74d5a2ab-646a-46ae-a1e7-8ef56226579f'] [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 
93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._deallocate_network(context, instance, requested_networks) [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.network_api.deallocate_for_instance( [ 1245.706866] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] data = neutron.list_ports(**search_opts) [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.list('ports', self.ports_path, retrieve_all, [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] for r in self._pagination(collection, path, **params): [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] res = self.get(path, params=params) [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.707379] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 
93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.retry_request("GET", action, body=body, [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.do_request(method, action, body=body, [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._handle_fault_response(status_code, replybody, resp) [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exception.Unauthorized() [ 1245.707833] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.Unauthorized: Not authorized. [ 1245.710220] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1245.728026] env[68492]: DEBUG oslo_concurrency.lockutils [None req-109c6c0f-9e3b-4501-99b3-c0860c4ee4a4 tempest-InstanceActionsTestJSON-1991500879 tempest-InstanceActionsTestJSON-1991500879-project-member] Lock "c9618d2a-72ce-4395-b739-2585861bc446" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.301s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.740842] env[68492]: DEBUG nova.compute.manager [None req-74ebf5f0-6bb4-41b3-876a-2ece4ed79bbc tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] [instance: 9bffaa25-3195-4077-a978-6b0dcc4b8ecd] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.768584] env[68492]: DEBUG nova.compute.manager [None req-74ebf5f0-6bb4-41b3-876a-2ece4ed79bbc tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] [instance: 9bffaa25-3195-4077-a978-6b0dcc4b8ecd] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1245.777732] env[68492]: INFO nova.scheduler.client.report [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Deleted allocations for instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f [ 1245.794834] env[68492]: DEBUG oslo_concurrency.lockutils [None req-74ebf5f0-6bb4-41b3-876a-2ece4ed79bbc tempest-AttachVolumeTestJSON-1751926934 tempest-AttachVolumeTestJSON-1751926934-project-member] Lock "9bffaa25-3195-4077-a978-6b0dcc4b8ecd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.853s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.799573] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ec703a5c-9f2a-442f-b39e-e9cca402a781 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 636.296s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.800549] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 438.871s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.800766] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Acquiring lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1245.800969] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.801148] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.803267] env[68492]: INFO nova.compute.manager [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Terminating instance [ 1245.804691] env[68492]: DEBUG nova.compute.manager [None 
req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1245.805057] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1245.805776] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-40906cdf-aace-4d33-980c-ed8c817a0146 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.814717] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78680f4d-9265-45d3-b535-93727f54424b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.829983] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.831262] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1245.851416] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f could not be found. [ 1245.851752] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1245.852912] env[68492]: INFO nova.compute.manager [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1245.852912] env[68492]: DEBUG oslo.service.loopingcall [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1245.852912] env[68492]: DEBUG nova.compute.manager [-] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1245.852912] env[68492]: DEBUG nova.network.neutron [-] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1245.887035] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1245.887381] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1245.889037] env[68492]: INFO nova.compute.claims [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1245.899604] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1245.978659] env[68492]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1245.978906] env[68492]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1245.980367] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-b0ef03f7-a9a0-4230-9c26-d67ecf20c3a1'] [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1245.980367] env[68492]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.980918] env[68492]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1245.980918] env[68492]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1245.981464] env[68492]: ERROR oslo.service.loopingcall [ 1245.982106] env[68492]: ERROR nova.compute.manager [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1246.014836] env[68492]: ERROR nova.compute.manager [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] exception_handler_v20(status_code, error_body) [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise client_exc(message=error_message, [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Neutron server returns request_ids: ['req-b0ef03f7-a9a0-4230-9c26-d67ecf20c3a1'] [ 1246.014836] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During handling of the above exception, another exception occurred: [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Traceback (most recent call last): [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._delete_instance(context, instance, bdms) [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._shutdown_instance(context, instance, bdms) [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._try_deallocate_network(context, instance, requested_networks) [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] with excutils.save_and_reraise_exception(): [ 1246.015266] env[68492]: ERROR 
nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.015266] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.force_reraise() [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise self.value [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] _deallocate_network_with_retries() [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return evt.wait() [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = hub.switch() [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.greenlet.switch() [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1246.015684] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = func(*self.args, **self.kw) [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] result = f(*args, **kwargs) [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._deallocate_network( [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self.network_api.deallocate_for_instance( [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 
93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] data = neutron.list_ports(**search_opts) [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.list('ports', self.ports_path, retrieve_all, [ 1246.016046] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] for r in self._pagination(collection, path, **params): [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] res = self.get(path, params=params) [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.retry_request("GET", action, body=body, [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1246.016447] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] return self.do_request(method, action, body=body, [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] ret = obj(*args, **kwargs) [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] self._handle_fault_response(status_code, replybody, resp) [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1246.016831] env[68492]: ERROR nova.compute.manager [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] [ 1246.054352] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "03afef99-e2dd-4467-8426-fbe50481aa6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1246.054584] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1246.056487] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.256s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1246.057730] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 85.813s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1246.057937] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1246.058125] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "93eb7973-ebd9-4e69-a7ab-5a3036c3f94f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1246.109752] env[68492]: INFO nova.compute.manager [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] [instance: 93eb7973-ebd9-4e69-a7ab-5a3036c3f94f] Successfully reverted task state from None on failure for instance. [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server [None req-eb88b088-63f8-4270-8674-93d76cd28d49 tempest-ServerExternalEventsTest-61186214 tempest-ServerExternalEventsTest-61186214-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-b0ef03f7-a9a0-4230-9c26-d67ecf20c3a1'] [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1246.113193] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1246.113671] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server return f(*args, 
**kwargs) [ 1246.114573] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server raise self.value [ 1246.115365] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1246.116025] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1246.116605] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1246.117172] 
env[68492]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1246.117172] env[68492]: ERROR oslo_messaging.rpc.server [ 1246.216330] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6753ccc8-e81b-4379-8ed2-a0be714fe900 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.223646] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d49da86-a665-456f-b579-acc422f030eb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.253072] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1246.253805] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1246.253955] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1246.254667] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f38e6bec-f260-4b03-b68c-0c5bfa2713b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.261844] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be7b02d-ddda-49f5-9507-ac58c7300254 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.275587] env[68492]: DEBUG nova.compute.provider_tree [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1246.283756] env[68492]: DEBUG nova.scheduler.client.report [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1246.299082] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.412s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1246.299595] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1246.301750] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.402s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1246.303373] env[68492]: INFO nova.compute.claims [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1246.331756] env[68492]: DEBUG nova.compute.utils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1246.333312] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1246.333615] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1246.346940] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1246.409114] env[68492]: DEBUG nova.policy [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '721b0fa31ea449c88dc7dcf86ab7b74c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '514e008c899841c2ae6cd90a3519df72', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1246.424708] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1246.455130] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1246.455442] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1246.455627] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1246.455856] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1246.456063] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1246.456259] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1246.456471] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1246.456631] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1246.456829] env[68492]: DEBUG 
nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1246.457015] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1246.457228] env[68492]: DEBUG nova.virt.hardware [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1246.458306] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f143ed-2539-4e3b-b196-72d1bc9c0ed0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.467523] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-806ce9d4-0bda-4414-8b03-489fb12469d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.628015] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b87e943-5481-48b3-b100-99756eda03af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.635610] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c053bd9d-4c1f-4d39-b86a-05fb8b907c4f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.665316] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c80d19b-32f7-430b-9f2a-3933fa7ea317 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.671919] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1993e1e9-f583-4527-97e1-4d9e9c0d146a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.684886] env[68492]: DEBUG nova.compute.provider_tree [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1246.694503] env[68492]: DEBUG nova.scheduler.client.report [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1246.709374] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1246.709374] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1246.755241] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Successfully created port: 249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1246.761544] env[68492]: DEBUG nova.compute.utils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1246.762742] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1246.762903] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1246.775294] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Start building block device mappings for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1246.823404] env[68492]: DEBUG nova.policy [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b3e0da6029c47fd95d02b8dd96b34be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03f130a3aa664316941ed2021c0ff9d2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1246.849122] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1246.877182] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1246.877440] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1246.877599] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1246.877778] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1246.877920] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1246.878153] env[68492]: DEBUG nova.virt.hardware [None 
req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1246.878388] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1246.878556] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1246.878724] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1246.878887] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1246.879070] env[68492]: DEBUG nova.virt.hardware [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1246.879946] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-700860e9-4546-452e-8026-f2283e1aa9fd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.889755] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c932f30-bec1-40c8-a946-599a926bca05 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.463158] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Successfully created port: 8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1247.578149] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Successfully updated port: 249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1247.588121] env[68492]: DEBUG nova.compute.manager [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] [instance: 
e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Received event network-vif-plugged-249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1247.588181] env[68492]: DEBUG oslo_concurrency.lockutils [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] Acquiring lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.588356] env[68492]: DEBUG oslo_concurrency.lockutils [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1247.588523] env[68492]: DEBUG oslo_concurrency.lockutils [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1247.588698] env[68492]: DEBUG nova.compute.manager [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] No waiting events found dispatching network-vif-plugged-249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1247.588862] env[68492]: WARNING nova.compute.manager [req-91a570bf-fa7a-4636-9263-decb27c30c87 req-d6ee5f7e-2389-4d1a-bd24-4cdf96daad8a service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Received unexpected event network-vif-plugged-249dba2a-46a1-4d08-bc83-b0bf41fe657d for instance with vm_state building and task_state spawning. [ 1247.597275] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1247.597275] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1247.597275] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1247.635804] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1247.803842] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Updating instance_info_cache with network_info: [{"id": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "address": "fa:16:3e:fd:3b:a0", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap249dba2a-46", "ovs_interfaceid": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1247.814963] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1247.815255] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance network_info: |[{"id": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "address": "fa:16:3e:fd:3b:a0", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap249dba2a-46", "ovs_interfaceid": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1247.815684] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:3b:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '11b669be-fb26-4ef8-bdb6-c77ab9d06daf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '249dba2a-46a1-4d08-bc83-b0bf41fe657d', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1247.823268] env[68492]: DEBUG oslo.service.loopingcall [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1247.823770] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1247.824016] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8aa0860c-bde7-49ce-812a-b52d3e76d2a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.843418] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1247.843418] env[68492]: value = "task-3395458" [ 1247.843418] env[68492]: _type = "Task" [ 1247.843418] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1247.854534] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395458, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.178238] env[68492]: DEBUG nova.compute.manager [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Received event network-vif-plugged-8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1248.178238] env[68492]: DEBUG oslo_concurrency.lockutils [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] Acquiring lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.178388] env[68492]: DEBUG oslo_concurrency.lockutils [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.178708] env[68492]: DEBUG oslo_concurrency.lockutils [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1248.178708] env[68492]: DEBUG nova.compute.manager [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] No waiting events found dispatching network-vif-plugged-8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1248.178868] env[68492]: WARNING nova.compute.manager [req-8253eb80-1d01-4ad2-9fb1-233b8dac95a0 req-916d579b-4d7f-4e20-b83a-cd59c8abb9de service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Received unexpected event network-vif-plugged-8728dbcf-fdb8-4112-be74-a7b73987ac09 for instance with vm_state building and task_state spawning. 
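
The "No waiting events found" / "Received unexpected event" pairs above are nova's external-event plumbing at work: neutron reports the port as active, nova-api routes a network-vif-plugged event to the compute host, and the compute manager pops any waiter previously registered for that (instance, event) pair under the "<uuid>-events" lock. Here the events arrived before the spawn path started waiting, so there was nothing to pop and the events were logged and dropped. Below is a minimal, illustrative sketch of that register/pop pattern using plain threading primitives; it is not nova's eventlet-based implementation, and the names only mirror those in the log.

import threading

class InstanceEvents:
    """Toy version of the event-waiter registry seen in the log above."""

    def __init__(self):
        self._events = {}                # instance_uuid -> {event_name: Event}
        self._lock = threading.Lock()    # plays the role of the "<uuid>-events" lock

    def prepare_for_instance_event(self, instance_uuid, event_name):
        # Called before plugging the VIF so the notification cannot be
        # missed; returns an object the spawn path can wait on.
        with self._lock:
            waiter = threading.Event()
            self._events.setdefault(instance_uuid, {})[event_name] = waiter
            return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # Returns the registered waiter, or None when nobody is waiting --
        # the "No waiting events found dispatching ..." case.
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Matches the WARNING records: the port became active before the
        # spawn path registered a waiter, so the event is simply dropped.
        print("Received unexpected event %s for instance %s"
              % (event_name, instance_uuid))
    else:
        waiter.set()  # unblocks the thread that is spawning the instance

# Usage: register first, then dispatch -- the waiter is found and set.
events = InstanceEvents()
w = events.prepare_for_instance_event("e1c7c4bb", "network-vif-plugged-249dba2a")
external_instance_event(events, "e1c7c4bb", "network-vif-plugged-249dba2a")
assert w.is_set()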
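The surrounding CreateVM_Task and SearchDatastore_Task records ("Waiting for the task ... progress is 0% ... completed successfully") show oslo.vmware's polling model: the API wrapper submits a vSphere task, then periodically re-reads its state until it reaches a terminal state. A rough sketch of such a loop follows, under the assumption of a get_task_info callable standing in for the real vSphere property query; it is not the oslo.vmware API itself.

import time

class TaskFailed(Exception):
    """Raised when the polled task ends in an error state."""

def wait_for_task(get_task_info, interval=0.5):
    # Poll until the task reports success or error, logging progress the
    # way the _poll_task lines above do.
    while True:
        info = get_task_info()
        if info["state"] == "running":
            print("Task: {'id': %s} progress is %d%%."
                  % (info["id"], info.get("progress", 0)))
        elif info["state"] == "success":
            print("Task: {'id': %s} completed successfully." % info["id"])
            return info
        else:
            raise TaskFailed(info.get("error", "unknown error"))
        time.sleep(interval)

# Usage: a fake task that finishes on its third poll, standing in for
# the CreateVM_Task records above.
polls = iter([
    {"id": "task-3395458", "state": "running", "progress": 0},
    {"id": "task-3395458", "state": "running", "progress": 60},
    {"id": "task-3395458", "state": "success", "duration_secs": 0.337691},
])
wait_for_task(lambda: next(polls), interval=0.01)
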
[ 1248.253629] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Successfully updated port: 8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1248.267237] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1248.267372] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquired lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1248.267513] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1248.353668] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395458, 'name': CreateVM_Task, 'duration_secs': 0.337691} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1248.353855] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1248.354543] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1248.354715] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1248.355049] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1248.355313] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7dce95ed-a111-44d9-9899-57e023453437 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.360125] env[68492]: DEBUG oslo_vmware.api [None 
req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1248.360125] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]529ac032-2315-7437-f8f7-ce8cc125f203" [ 1248.360125] env[68492]: _type = "Task" [ 1248.360125] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.368295] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]529ac032-2315-7437-f8f7-ce8cc125f203, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.540666] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1248.865184] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Updating instance_info_cache with network_info: [{"id": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "address": "fa:16:3e:1e:35:77", "network": {"id": "4ee1bb7a-b9e6-4435-8bf2-65e70d2cc5c9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1593017181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03f130a3aa664316941ed2021c0ff9d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8728dbcf-fd", "ovs_interfaceid": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1248.872273] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1248.872415] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc 
{{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1248.872615] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1248.876419] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Releasing lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1248.876681] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance network_info: |[{"id": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "address": "fa:16:3e:1e:35:77", "network": {"id": "4ee1bb7a-b9e6-4435-8bf2-65e70d2cc5c9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1593017181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03f130a3aa664316941ed2021c0ff9d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8728dbcf-fd", "ovs_interfaceid": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1248.878304] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1e:35:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '274afb4c-04df-4213-8ad2-8f48a10d78a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8728dbcf-fdb8-4112-be74-a7b73987ac09', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1248.884545] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Creating folder: Project (03f130a3aa664316941ed2021c0ff9d2). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1248.884865] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-18f66b59-1dc9-4485-ae33-e044fd8a43ac {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.894668] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Created folder: Project (03f130a3aa664316941ed2021c0ff9d2) in parent group-v677434. [ 1248.894851] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Creating folder: Instances. Parent ref: group-v677510. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1248.895090] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-473bbd15-5a6f-4e60-aa38-3fd3beced12a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.904707] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Created folder: Instances in parent group-v677510. [ 1248.904941] env[68492]: DEBUG oslo.service.loopingcall [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1248.905142] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1248.905339] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f0469a38-da92-40f6-b094-e2390ae448ba {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.924667] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1248.924667] env[68492]: value = "task-3395461" [ 1248.924667] env[68492]: _type = "Task" [ 1248.924667] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.932233] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395461, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.434546] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395461, 'name': CreateVM_Task, 'duration_secs': 0.325848} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1249.434718] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1249.435344] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1249.435530] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1249.435874] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1249.436137] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f2e0891-5562-4b90-8895-0cff9c964fc6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.440240] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for the task: (returnval){ [ 1249.440240] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52b90cce-6b25-038a-4092-b9a1e54f22d6" [ 1249.440240] env[68492]: _type = "Task" [ 1249.440240] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1249.447675] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52b90cce-6b25-038a-4092-b9a1e54f22d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.615346] env[68492]: DEBUG nova.compute.manager [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Received event network-changed-249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1249.615629] env[68492]: DEBUG nova.compute.manager [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Refreshing instance network info cache due to event network-changed-249dba2a-46a1-4d08-bc83-b0bf41fe657d. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1249.615783] env[68492]: DEBUG oslo_concurrency.lockutils [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] Acquiring lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1249.615911] env[68492]: DEBUG oslo_concurrency.lockutils [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] Acquired lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1249.616080] env[68492]: DEBUG nova.network.neutron [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Refreshing network info cache for port 249dba2a-46a1-4d08-bc83-b0bf41fe657d {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1249.933749] env[68492]: DEBUG nova.network.neutron [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Updated VIF entry in instance network info cache for port 249dba2a-46a1-4d08-bc83-b0bf41fe657d. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1249.934107] env[68492]: DEBUG nova.network.neutron [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Updating instance_info_cache with network_info: [{"id": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "address": "fa:16:3e:fd:3b:a0", "network": {"id": "c38e131e-20a7-47d1-ae6a-f040e2f509f5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1475193371-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "514e008c899841c2ae6cd90a3519df72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11b669be-fb26-4ef8-bdb6-c77ab9d06daf", "external-id": "nsx-vlan-transportzone-633", "segmentation_id": 633, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap249dba2a-46", "ovs_interfaceid": "249dba2a-46a1-4d08-bc83-b0bf41fe657d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1249.944596] env[68492]: DEBUG oslo_concurrency.lockutils [req-12efabb1-6e02-4e6c-9e07-17513f296598 req-a2147270-6e29-4eec-807a-bfc6f5dcd930 service nova] Releasing lock "refresh_cache-e1c7c4bb-fb65-450c-8c28-11ccf986fe94" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1249.951913] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1249.952127] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1249.952318] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1250.211983] env[68492]: DEBUG nova.compute.manager [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Received event network-changed-8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1250.212207] env[68492]: DEBUG nova.compute.manager [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Refreshing instance network info cache due to event network-changed-8728dbcf-fdb8-4112-be74-a7b73987ac09. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1250.212418] env[68492]: DEBUG oslo_concurrency.lockutils [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] Acquiring lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1250.212559] env[68492]: DEBUG oslo_concurrency.lockutils [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] Acquired lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1250.212713] env[68492]: DEBUG nova.network.neutron [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Refreshing network info cache for port 8728dbcf-fdb8-4112-be74-a7b73987ac09 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1250.568891] env[68492]: DEBUG nova.network.neutron [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Updated VIF entry in instance network info cache for port 8728dbcf-fdb8-4112-be74-a7b73987ac09. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1250.569261] env[68492]: DEBUG nova.network.neutron [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Updating instance_info_cache with network_info: [{"id": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "address": "fa:16:3e:1e:35:77", "network": {"id": "4ee1bb7a-b9e6-4435-8bf2-65e70d2cc5c9", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1593017181-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03f130a3aa664316941ed2021c0ff9d2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8728dbcf-fd", "ovs_interfaceid": "8728dbcf-fdb8-4112-be74-a7b73987ac09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1250.578650] env[68492]: DEBUG oslo_concurrency.lockutils [req-2ab58458-4ea0-4b22-9d1e-8676e2f6ed97 req-5ee3af20-1d17-4ab4-becc-7d40a1285b07 service nova] Releasing lock "refresh_cache-29397c54-4bb2-4b43-afcb-9969d8dec996" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1252.075200] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "29397c54-4bb2-4b43-afcb-9969d8dec996" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1252.406608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1292.221215] env[68492]: WARNING oslo_vmware.rw_handles [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1292.221215] env[68492]: ERROR oslo_vmware.rw_handles [ 1292.221939] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1292.224345] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1292.224597] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Copying Virtual Disk [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/b0e8d4b4-6782-473e-aaac-e6e7a6e3ff6b/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1292.225314] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4b29b76d-ab95-44e2-b59c-6870c4e9c4d5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.233671] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1292.233671] env[68492]: value = "task-3395462" [ 1292.233671] env[68492]: _type = "Task" [ 1292.233671] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1292.241754] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395462, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1292.744395] env[68492]: DEBUG oslo_vmware.exceptions [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1292.744900] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1292.745564] env[68492]: ERROR nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.745564] env[68492]: Faults: ['InvalidArgument'] [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Traceback (most recent call last): [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] yield resources [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self.driver.spawn(context, instance, image_meta, [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self._fetch_image_if_missing(context, vi) [ 1292.745564] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] image_cache(vi, tmp_image_ds_loc) [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] vm_util.copy_virtual_disk( [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] session._wait_for_task(vmdk_copy_task) [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return self.wait_for_task(task_ref) [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return evt.wait() [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] result = hub.switch() [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1292.745913] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return self.greenlet.switch() [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self.f(*self.args, **self.kw) [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] raise exceptions.translate_fault(task_info.error) [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Faults: ['InvalidArgument'] [ 1292.746274] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] [ 1292.747200] env[68492]: INFO nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Terminating instance [ 1292.749058] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1292.749290] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 
tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1292.749968] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1292.750203] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1292.750436] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-95ac33bc-2f25-45e6-841b-52d20de3973b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.755390] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9f44bb7-e22f-49a0-b4e8-8ce5df25b730 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.762512] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1292.762754] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bbcb21f0-7cb9-4c31-8315-5d9abca02005 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.764953] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1292.765143] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1292.766147] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe2cdf37-82f3-4827-96e7-308bd249ccc9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.771111] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for the task: (returnval){ [ 1292.771111] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52220121-295e-050d-e4ef-9d2a4e1332b7" [ 1292.771111] env[68492]: _type = "Task" [ 1292.771111] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1292.779825] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52220121-295e-050d-e4ef-9d2a4e1332b7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1292.826561] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1292.826789] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1292.826957] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleting the datastore file [datastore2] 3b1ce4e1-bbad-4030-84d9-f814a44eec4a {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1292.827210] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d2b0b48a-e03f-4fe5-ad5d-540e1debb505 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1292.835140] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1292.835140] env[68492]: value = "task-3395464" [ 1292.835140] env[68492]: _type = "Task" [ 1292.835140] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1292.843083] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395464, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1293.282435] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1293.282709] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Creating directory with path [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1293.282947] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29e7db8e-c695-42b8-b9b6-61d67cedd780 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.294551] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Created directory with path [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1293.294738] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Fetch image to [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1293.294911] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1293.295654] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b539211-4aa0-4315-ab99-6feb5a646caa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.302142] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ee15cf2-9303-457e-9e42-e09b7b01878c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.311152] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe26a2a-2eb0-48cb-ace8-ff24ba4af89f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.343266] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ec983f-ae9c-4b1a-ad01-210c3839fa02 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.350085] env[68492]: DEBUG oslo_vmware.api [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395464, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068597} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1293.351543] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1293.351739] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1293.351917] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1293.352107] env[68492]: INFO nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1293.353850] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5bc85b26-b65c-49b1-b662-befce41f8aa2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.355734] env[68492]: DEBUG nova.compute.claims [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1293.355932] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1293.356170] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1293.381179] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1293.442407] env[68492]: DEBUG oslo_vmware.rw_handles [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1293.503695] env[68492]: DEBUG oslo_vmware.rw_handles [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1293.503695] env[68492]: DEBUG oslo_vmware.rw_handles [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1293.677238] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dcc2fab-fff5-486b-be1a-5677c0f829e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.685863] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a98d7f0-074e-4b03-a48c-27141f77181d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.714870] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db0add86-931e-45b2-888f-62e632a2a341 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.721806] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f035d805-be22-4a4b-8185-89d6a9c42f86 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.734238] env[68492]: DEBUG nova.compute.provider_tree [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1293.742604] env[68492]: DEBUG nova.scheduler.client.report [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1293.756794] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.401s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1293.757355] env[68492]: ERROR nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1293.757355] env[68492]: Faults: ['InvalidArgument'] [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Traceback (most recent call last): [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1293.757355] env[68492]: 
ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self.driver.spawn(context, instance, image_meta, [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self._fetch_image_if_missing(context, vi) [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] image_cache(vi, tmp_image_ds_loc) [ 1293.757355] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] vm_util.copy_virtual_disk( [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] session._wait_for_task(vmdk_copy_task) [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return self.wait_for_task(task_ref) [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return evt.wait() [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] result = hub.switch() [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] return self.greenlet.switch() [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1293.757653] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] self.f(*self.args, **self.kw) [ 1293.757944] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1293.757944] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] raise exceptions.translate_fault(task_info.error) [ 1293.757944] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1293.757944] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Faults: ['InvalidArgument'] [ 1293.757944] env[68492]: ERROR nova.compute.manager [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] [ 1293.758130] env[68492]: DEBUG nova.compute.utils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1293.759360] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Build of instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a was re-scheduled: A specified parameter was not correct: fileType [ 1293.759360] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1293.759723] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1293.759910] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1293.760091] env[68492]: DEBUG nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1293.760286] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1294.248595] env[68492]: DEBUG nova.network.neutron [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1294.259200] env[68492]: INFO nova.compute.manager [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Took 0.50 seconds to deallocate network for instance. [ 1294.358023] env[68492]: INFO nova.scheduler.client.report [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleted allocations for instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a [ 1294.383161] env[68492]: DEBUG oslo_concurrency.lockutils [None req-83393aa1-c010-42b4-99f3-59c41308901a tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 635.183s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.386020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 439.693s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1294.386020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1294.386020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1294.386219] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.387500] env[68492]: INFO nova.compute.manager [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Terminating instance [ 1294.389382] env[68492]: DEBUG nova.compute.manager [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1294.389732] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1294.390322] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-724cb0f5-e97a-4ed1-b722-67faabe960c6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.402055] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea4d23ea-e535-4b1b-8b94-e21974dffbed {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.414141] env[68492]: DEBUG nova.compute.manager [None req-b53d8afa-057b-451a-8c4c-00ada6c4cc0a tempest-ServersTestJSON-1214267113 tempest-ServersTestJSON-1214267113-project-member] [instance: 49885647-f6a0-468a-bf58-206de779c896] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1294.434579] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3b1ce4e1-bbad-4030-84d9-f814a44eec4a could not be found. [ 1294.434792] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1294.434975] env[68492]: INFO nova.compute.manager [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1294.435242] env[68492]: DEBUG oslo.service.loopingcall [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1294.435475] env[68492]: DEBUG nova.compute.manager [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1294.435571] env[68492]: DEBUG nova.network.neutron [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1294.442586] env[68492]: DEBUG nova.compute.manager [None req-b53d8afa-057b-451a-8c4c-00ada6c4cc0a tempest-ServersTestJSON-1214267113 tempest-ServersTestJSON-1214267113-project-member] [instance: 49885647-f6a0-468a-bf58-206de779c896] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1294.461964] env[68492]: DEBUG nova.network.neutron [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1294.464029] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b53d8afa-057b-451a-8c4c-00ada6c4cc0a tempest-ServersTestJSON-1214267113 tempest-ServersTestJSON-1214267113-project-member] Lock "49885647-f6a0-468a-bf58-206de779c896" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.748s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.472454] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1294.500963] env[68492]: INFO nova.compute.manager [-] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] Took 0.07 seconds to deallocate network for instance. 
[ 1294.536609] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1294.536954] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1294.538597] env[68492]: INFO nova.compute.claims [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1294.594296] env[68492]: DEBUG oslo_concurrency.lockutils [None req-854c978e-8958-462a-8ffb-c180c2316a05 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.210s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.595276] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 134.350s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1294.595495] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 3b1ce4e1-bbad-4030-84d9-f814a44eec4a] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1294.595767] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "3b1ce4e1-bbad-4030-84d9-f814a44eec4a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.808066] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f06fa0-8acd-4e71-8754-c713254d0dae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.815396] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f505eae-a0be-4f4c-8c21-bacb3dc78d76 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.845348] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f4038cf-33fb-4157-a957-f7cc9415ff57 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.852225] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d80f3dc-c88f-481d-97a6-9d6d8b3a6233 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.545558] env[68492]: DEBUG nova.compute.provider_tree [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1295.553677] env[68492]: DEBUG nova.scheduler.client.report [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1295.567377] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.030s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.567823] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1295.606091] env[68492]: DEBUG nova.compute.utils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1295.607430] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1295.607611] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1295.617521] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1295.662180] env[68492]: DEBUG nova.policy [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8cfdece4223d4903971a4508920f132d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '56e99974efcb4f4aa10d25bbb15f0dbd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1295.678126] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1295.702472] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1295.702707] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1295.702870] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1295.703066] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1295.703218] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1295.703360] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1295.703559] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1295.703717] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1295.703880] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 
tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1295.704052] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1295.704226] env[68492]: DEBUG nova.virt.hardware [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1295.705165] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40d09ee6-cd9f-4903-9146-afea4c63390c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.713197] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5da085a0-20f1-4f53-b7eb-af2070a908b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1296.021709] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Successfully created port: 5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1297.146242] env[68492]: DEBUG nova.compute.manager [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Received event network-vif-plugged-5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1297.146242] env[68492]: DEBUG oslo_concurrency.lockutils [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service nova] Acquiring lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1297.146242] env[68492]: DEBUG oslo_concurrency.lockutils [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service nova] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.146242] env[68492]: DEBUG oslo_concurrency.lockutils [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service nova] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1297.146748] env[68492]: DEBUG nova.compute.manager [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service 
nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] No waiting events found dispatching network-vif-plugged-5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1297.146748] env[68492]: WARNING nova.compute.manager [req-93a401f3-1c1a-442a-b9b7-dc65e202729e req-a8707a37-f48a-4d12-aaa7-2553c2e466f6 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Received unexpected event network-vif-plugged-5f84e4ad-8cb2-49e9-9819-637be8b7ff59 for instance with vm_state building and task_state spawning. [ 1297.198291] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Successfully updated port: 5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1297.213316] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1297.213316] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquired lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1297.213316] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1297.251909] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance cache missing network info. 
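The Acquiring/Acquired/Releasing lock bookkeeping in the entries above comes from oslo.concurrency's lockutils, which Nova uses to serialize work per instance UUID. A short sketch of the two forms visible here (the function bodies are placeholders, not Nova code):

from oslo_concurrency import lockutils

instance_uuid = '29bd5cc4-d884-4202-b503-74920a0b4ec5'  # from the entries above

# Context-manager form, as around the "refresh_cache-<uuid>" lock:
with lockutils.lock('refresh_cache-%s' % instance_uuid):
    pass  # rebuild the instance's network info cache here

# Decorator form, which produces the 'acquired by "..." :: waited 0.000s'
# lines logged for InstanceEvents.pop_instance_event.._pop_event:
@lockutils.synchronized('%s-events' % instance_uuid)
def _pop_event():
    pass  # pop the waiting network-vif event, if a waiter registered one

_pop_event()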
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1297.505242] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Updating instance_info_cache with network_info: [{"id": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "address": "fa:16:3e:15:a0:bd", "network": {"id": "9bcde8ff-6430-4bd5-8c9d-3118ad494d49", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1389413544-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "56e99974efcb4f4aa10d25bbb15f0dbd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f84e4ad-8c", "ovs_interfaceid": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1297.520818] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Releasing lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1297.521114] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance network_info: |[{"id": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "address": "fa:16:3e:15:a0:bd", "network": {"id": "9bcde8ff-6430-4bd5-8c9d-3118ad494d49", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1389413544-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "56e99974efcb4f4aa10d25bbb15f0dbd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f84e4ad-8c", "ovs_interfaceid": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1297.521750] env[68492]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:15:a0:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dcf5c3f7-4e33-4f21-b323-3673930b789c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5f84e4ad-8cb2-49e9-9819-637be8b7ff59', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1297.529207] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Creating folder: Project (56e99974efcb4f4aa10d25bbb15f0dbd). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1297.529702] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ffc9f86-fe78-45ad-9430-23e721724c81 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.540610] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Created folder: Project (56e99974efcb4f4aa10d25bbb15f0dbd) in parent group-v677434. [ 1297.540798] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Creating folder: Instances. Parent ref: group-v677513. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1297.541042] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-542463e9-f386-4992-896f-7f489f3d18c5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.550069] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Created folder: Instances in parent group-v677513. [ 1297.550371] env[68492]: DEBUG oslo.service.loopingcall [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1297.550556] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1297.551078] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b753219c-8a71-4256-82e0-71dea0a131be {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.569631] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1297.569631] env[68492]: value = "task-3395467" [ 1297.569631] env[68492]: _type = "Task" [ 1297.569631] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.578252] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395467, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1298.079566] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395467, 'name': CreateVM_Task, 'duration_secs': 0.423737} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1298.079745] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1298.080422] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1298.080595] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1298.080930] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1298.081203] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-42e2ca8f-de75-414b-88a1-2357daa43efc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.085680] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for the task: (returnval){ [ 1298.085680] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c8a883-fa58-e967-11e9-8e82bedf62ef" [ 1298.085680] env[68492]: _type = "Task" [ 1298.085680] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1298.093007] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c8a883-fa58-e967-11e9-8e82bedf62ef, 'name': SearchDatastore_Task} progress is 0%. 
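The "Waiting for the task" / "progress is N%" pairs above are oslo.vmware's task polling. The real loop lives in oslo_vmware.api and runs on a green-thread looping call; the sketch below only mirrors its observable behaviour, with a hypothetical poll_task callable standing in for the vCenter task-info query.

import time
from types import SimpleNamespace

def wait_for_task(poll_task, interval=0.5):
    # poll_task() returns an object with .state, .progress, .result, .error;
    # each iteration corresponds to one "progress is N%" DEBUG line.
    while True:
        info = poll_task()
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            raise RuntimeError(info.error)
        time.sleep(interval)  # the real code idles via a looping call

# Tiny stand-in task that succeeds on the second poll:
states = iter([
    SimpleNamespace(state='running', progress=0, result=None, error=None),
    SimpleNamespace(state='success', progress=100, result='vm-ref', error=None),
])
print(wait_for_task(lambda: next(states), interval=0))  # -> 'vm-ref'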
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1298.596112] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1298.596476] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1298.596695] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1299.169346] env[68492]: DEBUG nova.compute.manager [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Received event network-changed-5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1299.169503] env[68492]: DEBUG nova.compute.manager [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Refreshing instance network info cache due to event network-changed-5f84e4ad-8cb2-49e9-9819-637be8b7ff59. 
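The network-vif-plugged / "No waiting events found" / "unexpected event" sequence earlier, and the network-changed event handled here, follow Nova's external-event handshake: a waiter must register before the Neutron event arrives, otherwise the event is logged and dropped. A rough, simplified model of that registry (not Nova's actual class; names and shapes are illustrative):

import threading

class InstanceEvents:
    def __init__(self):
        self._events = {}   # (instance_uuid, event_name) -> threading.Event
        self._lock = threading.Lock()

    def prepare(self, uuid, name):
        with self._lock:
            ev = threading.Event()
            self._events[(uuid, name)] = ev
            return ev

    def pop(self, uuid, name):
        with self._lock:
            return self._events.pop((uuid, name), None)

events = InstanceEvents()
uuid = '29bd5cc4-d884-4202-b503-74920a0b4ec5'
ev = events.pop(uuid, 'network-vif-plugged')
if ev is None:
    # Matches the WARNING above: the instance was still building/spawning,
    # so no waiter had registered for this event yet.
    print('Received unexpected event network-vif-plugged for %s' % uuid)
else:
    ev.set()  # wake the spawn path waiting for the VIF plug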
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1299.169719] env[68492]: DEBUG oslo_concurrency.lockutils [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] Acquiring lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1299.169862] env[68492]: DEBUG oslo_concurrency.lockutils [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] Acquired lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1299.170033] env[68492]: DEBUG nova.network.neutron [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Refreshing network info cache for port 5f84e4ad-8cb2-49e9-9819-637be8b7ff59 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1299.235315] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1299.520443] env[68492]: DEBUG nova.network.neutron [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Updated VIF entry in instance network info cache for port 5f84e4ad-8cb2-49e9-9819-637be8b7ff59. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1299.520793] env[68492]: DEBUG nova.network.neutron [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Updating instance_info_cache with network_info: [{"id": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "address": "fa:16:3e:15:a0:bd", "network": {"id": "9bcde8ff-6430-4bd5-8c9d-3118ad494d49", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1389413544-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "56e99974efcb4f4aa10d25bbb15f0dbd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f84e4ad-8c", "ovs_interfaceid": "5f84e4ad-8cb2-49e9-9819-637be8b7ff59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1299.530050] env[68492]: DEBUG oslo_concurrency.lockutils [req-53779f08-6051-41b4-9d6c-f14ea52400be req-0b098e91-e7d2-4d52-abc3-8a06006f1d71 service nova] Releasing lock "refresh_cache-29bd5cc4-d884-4202-b503-74920a0b4ec5" {{(pid=68492) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1300.231240] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1300.231558] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1300.244359] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.244729] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.244825] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.245148] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1300.246606] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754ffe11-58b8-4f2f-8c98-c458ce5f1b51 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.249952] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.255888] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12152fab-7c59-4b76-8ccf-6c154be8c89f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.270172] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3f527f2-360a-4512-9566-0ef3c9465e1a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.276681] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d910ea9f-a324-4b24-8752-f8e610ebcb56 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.306548] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180962MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1300.306696] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.306878] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.381382] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 569b49ff-047a-4494-b869-6598764da9d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.381540] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.381670] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.381796] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.381919] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.382052] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.382175] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.382291] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.382408] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.382677] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1300.394389] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 5bec90ae-12e8-4620-ac96-76d82e123f7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.404901] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.415827] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9694688e-b937-4999-9b25-3caea82695b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.426325] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 51e8e546-2bd7-495b-a81d-a6cdc4dba99c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.436111] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 6a630f7b-3c45-42b2-b8ab-e93490cc1eb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.446857] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.456711] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 74853d33-dc81-497b-9af3-72973e20e60b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.466338] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f5dde0b2-1403-466c-aa23-a5573915256d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.476673] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1300.476935] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1300.477101] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1300.689266] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6d8ca62-7e78-4ccc-9dae-9b7ac3fffb63 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.696690] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-113efba4-d53f-4eee-8b8e-fe265eb0540a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.726308] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d9d7c9-7fac-43d4-ae9f-37c21705fa04 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.734218] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10253c4b-1e8e-4b31-b271-3ec494ec52a4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.746504] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1300.755556] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1300.775300] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1300.775491] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.469s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.775980] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1301.776271] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1301.776308] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1301.797549] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.797702] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.797830] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.797956] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798128] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798262] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798384] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798501] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798616] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. 
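A quick cross-check of the resource-tracker arithmetic in the "Final resource view" entry a few lines above: ten m1.nano instances (1 vCPU, 128 MB RAM, 1 GB root disk each, per the flavor logged earlier) plus the 512 MB reserved in the MEMORY_MB inventory reproduce the logged usage figures exactly.

instances = 10                   # 'total allocated vcpus: 10'
flavor = {'vcpus': 1, 'memory_mb': 128, 'root_gb': 1}   # m1.nano
reserved_ram_mb = 512            # MEMORY_MB 'reserved' in the inventory line

used_vcpus = instances * flavor['vcpus']                         # 10
used_ram_mb = instances * flavor['memory_mb'] + reserved_ram_mb  # 1792
used_disk_gb = instances * flavor['root_gb']                     # 10
print(used_vcpus, used_ram_mb, used_disk_gb)  # 10 1792 10, as logged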
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798733] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1301.798851] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1303.230603] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.230904] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.230876] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.231502] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.231756] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.231917] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
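The "Running periodic task ..." entries above (_poll_unconfirmed_resizes, _instance_usage_audit, _reclaim_queued_deletes, and so on) are driven by oslo.service's periodic-task machinery. A minimal sketch of the pattern follows; the 60-second spacing is illustrative, not Nova's configured interval.

from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _heal_instance_info_cache(self, context):
        print('Running periodic task _heal_instance_info_cache')

mgr = Manager()
# One scheduler pass; returns how long to idle before the next pass.
idle = mgr.run_periodic_tasks(context=None)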
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1342.368661] env[68492]: WARNING oslo_vmware.rw_handles [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1342.368661] env[68492]: ERROR oslo_vmware.rw_handles [ 1342.369440] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1342.371327] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1342.371636] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Copying Virtual Disk [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/b179c1b2-e0a1-4003-82b6-229d3a8c766a/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1342.371968] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7c0ffeec-c0f2-45bf-b998-923e0b1dd14e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.380328] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 
tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for the task: (returnval){ [ 1342.380328] env[68492]: value = "task-3395477" [ 1342.380328] env[68492]: _type = "Task" [ 1342.380328] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1342.388492] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Task: {'id': task-3395477, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1342.890664] env[68492]: DEBUG oslo_vmware.exceptions [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1342.890966] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1342.891534] env[68492]: ERROR nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1342.891534] env[68492]: Faults: ['InvalidArgument'] [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] Traceback (most recent call last): [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] yield resources [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self.driver.spawn(context, instance, image_meta, [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self._fetch_image_if_missing(context, vi) [ 1342.891534] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] image_cache(vi, tmp_image_ds_loc) [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] vm_util.copy_virtual_disk( [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] session._wait_for_task(vmdk_copy_task) [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return self.wait_for_task(task_ref) [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return evt.wait() [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] result = hub.switch() [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1342.891889] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return self.greenlet.switch() [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self.f(*self.args, **self.kw) [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] raise exceptions.translate_fault(task_info.error) [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] Faults: ['InvalidArgument'] [ 1342.892210] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] [ 1342.892210] env[68492]: INFO nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 
569b49ff-047a-4494-b869-6598764da9d7] Terminating instance [ 1342.893400] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1342.893608] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1342.893848] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-368c66a3-0a4c-4a68-b268-56b0a78f9d1a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.896134] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1342.896351] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1342.897077] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e34f1ed5-cb82-409c-ae0a-32646b5cab85 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.905136] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1342.906160] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9abc41e8-0d76-421d-9fe7-3fd1bd5e30a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.907581] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1342.907753] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Folder [datastore2] devstack-image-cache_base created. 
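The traceback above ends in a VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']): when the CopyVirtualDisk_Task poll comes back in the error state, oslo.vmware converts the task's fault list into an exception via oslo_vmware.exceptions.translate_fault. The sketch below is a heavily simplified stand-in for that mapping, using an illustrative dict shape rather than the real fault object.

class VimFaultException(Exception):
    # Simplified stand-in for oslo_vmware.exceptions.VimFaultException.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def translate_fault(task_error):
    # task_error mimics the 'error' field of a failed task's info object.
    return VimFaultException(task_error['faults'], task_error['message'])

err = {'faults': ['InvalidArgument'],
       'message': 'A specified parameter was not correct: fileType'}
try:
    raise translate_fault(err)
except VimFaultException as e:
    print(e, e.fault_list)  # matches the ERROR lines above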
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1342.908423] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-502de96a-4b64-4b9a-8744-35ed76c4e04f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.913347] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for the task: (returnval){ [ 1342.913347] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]523f6934-0ec4-037d-3f55-d1dba3b852e7" [ 1342.913347] env[68492]: _type = "Task" [ 1342.913347] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1342.920595] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]523f6934-0ec4-037d-3f55-d1dba3b852e7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1342.978439] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1342.978785] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1342.979079] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Deleting the datastore file [datastore2] 569b49ff-047a-4494-b869-6598764da9d7 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1342.979419] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5efcd586-5808-48e5-bfec-13fe5e860a60 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1342.985924] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for the task: (returnval){ [ 1342.985924] env[68492]: value = "task-3395480" [ 1342.985924] env[68492]: _type = "Task" [ 1342.985924] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1342.993745] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Task: {'id': task-3395480, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1343.423364] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1343.423731] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Creating directory with path [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1343.423852] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f7735371-5c4f-4acb-a7d0-580df91c2183 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.435517] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Created directory with path [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1343.435691] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Fetch image to [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1343.435853] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1343.436598] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-363af8d2-6e3b-4fc1-b9e0-873a9a738d7f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.442964] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9fb9a33-fc7a-4034-956e-36df1d811c25 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.452195] env[68492]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abfcd371-27d5-4aef-8cd1-dd0423ef29ce {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.482566] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62e38360-b601-43ad-a824-a3991e58994e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.490917] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4dba8d0a-0ce4-469c-b765-bc8e1d38a0af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.495200] env[68492]: DEBUG oslo_vmware.api [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Task: {'id': task-3395480, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.091117} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1343.495709] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1343.495900] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1343.496109] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1343.496294] env[68492]: INFO nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1343.498339] env[68492]: DEBUG nova.compute.claims [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1343.498503] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1343.498706] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1343.513097] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1343.675444] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1343.743119] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1343.743361] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1343.842962] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1509f82-6344-47a5-beea-a77dff63a30d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.852063] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37bbfecc-602c-4d49-8eb0-0d02bd77052b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.883810] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb12e856-6542-44c2-b31f-c22738d0f6c0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.891717] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bff7d8f5-2839-4857-bc87-cc456d500ed8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1343.905924] env[68492]: DEBUG nova.compute.provider_tree [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1343.914524] env[68492]: DEBUG nova.scheduler.client.report [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1343.929896] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.431s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1343.930499] env[68492]: ERROR nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1343.930499] env[68492]: Faults: ['InvalidArgument'] [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] Traceback (most recent call last): [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/compute/manager.py", line 
2616, in _build_and_run_instance [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self.driver.spawn(context, instance, image_meta, [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self._fetch_image_if_missing(context, vi) [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] image_cache(vi, tmp_image_ds_loc) [ 1343.930499] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] vm_util.copy_virtual_disk( [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] session._wait_for_task(vmdk_copy_task) [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return self.wait_for_task(task_ref) [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return evt.wait() [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] result = hub.switch() [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] return self.greenlet.switch() [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1343.930835] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] self.f(*self.args, **self.kw) [ 1343.931186] env[68492]: ERROR nova.compute.manager [instance: 
569b49ff-047a-4494-b869-6598764da9d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1343.931186] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] raise exceptions.translate_fault(task_info.error) [ 1343.931186] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1343.931186] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] Faults: ['InvalidArgument'] [ 1343.931186] env[68492]: ERROR nova.compute.manager [instance: 569b49ff-047a-4494-b869-6598764da9d7] [ 1343.931186] env[68492]: DEBUG nova.compute.utils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1343.932975] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Build of instance 569b49ff-047a-4494-b869-6598764da9d7 was re-scheduled: A specified parameter was not correct: fileType [ 1343.932975] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1343.933383] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1343.933536] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1343.933709] env[68492]: DEBUG nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1343.933871] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1344.340407] env[68492]: DEBUG nova.network.neutron [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1344.351634] env[68492]: INFO nova.compute.manager [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Took 0.42 seconds to deallocate network for instance. [ 1344.442106] env[68492]: INFO nova.scheduler.client.report [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Deleted allocations for instance 569b49ff-047a-4494-b869-6598764da9d7 [ 1344.464914] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e117a483-85f3-4160-9438-783ae3e42b52 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "569b49ff-047a-4494-b869-6598764da9d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 639.086s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.466164] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "569b49ff-047a-4494-b869-6598764da9d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 442.966s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1344.466407] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Acquiring lock "569b49ff-047a-4494-b869-6598764da9d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1344.466622] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member]
Lock "569b49ff-047a-4494-b869-6598764da9d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1344.466787] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "569b49ff-047a-4494-b869-6598764da9d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.468676] env[68492]: INFO nova.compute.manager [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Terminating instance [ 1344.470337] env[68492]: DEBUG nova.compute.manager [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1344.470529] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1344.471185] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c533803-7cad-4ab3-8be2-e131487e904c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.478651] env[68492]: DEBUG nova.compute.manager [None req-98361909-4d0e-4405-ae94-2821eeeea069 tempest-InstanceActionsNegativeTestJSON-1912133732 tempest-InstanceActionsNegativeTestJSON-1912133732-project-member] [instance: 5bec90ae-12e8-4620-ac96-76d82e123f7d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1344.484983] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f1f507a-19ae-4eed-acb7-dedf976df36e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.502797] env[68492]: DEBUG nova.compute.manager [None req-98361909-4d0e-4405-ae94-2821eeeea069 tempest-InstanceActionsNegativeTestJSON-1912133732 tempest-InstanceActionsNegativeTestJSON-1912133732-project-member] [instance: 5bec90ae-12e8-4620-ac96-76d82e123f7d] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1344.514557] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 569b49ff-047a-4494-b869-6598764da9d7 could not be found. 
[ 1344.514747] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1344.514919] env[68492]: INFO nova.compute.manager [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1344.515171] env[68492]: DEBUG oslo.service.loopingcall [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1344.515573] env[68492]: DEBUG nova.compute.manager [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1344.515672] env[68492]: DEBUG nova.network.neutron [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1344.531466] env[68492]: DEBUG oslo_concurrency.lockutils [None req-98361909-4d0e-4405-ae94-2821eeeea069 tempest-InstanceActionsNegativeTestJSON-1912133732 tempest-InstanceActionsNegativeTestJSON-1912133732-project-member] Lock "5bec90ae-12e8-4620-ac96-76d82e123f7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.510s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.538973] env[68492]: DEBUG nova.network.neutron [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1344.546612] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1344.549613] env[68492]: INFO nova.compute.manager [-] [instance: 569b49ff-047a-4494-b869-6598764da9d7] Took 0.03 seconds to deallocate network for instance.
[ 1344.593807] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1344.594073] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1344.595441] env[68492]: INFO nova.compute.claims [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1344.630795] env[68492]: DEBUG oslo_concurrency.lockutils [None req-ecf664db-93e6-4ccd-b163-e9c8d181a235 tempest-ServerAddressesNegativeTestJSON-505752412 tempest-ServerAddressesNegativeTestJSON-505752412-project-member] Lock "569b49ff-047a-4494-b869-6598764da9d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.165s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.631643] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "569b49ff-047a-4494-b869-6598764da9d7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 184.386s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1344.631835] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 569b49ff-047a-4494-b869-6598764da9d7] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1344.632013] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "569b49ff-047a-4494-b869-6598764da9d7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.834423] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b0b33c4-50b0-435c-9051-6b4edf2ed841 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.841630] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee1ad701-ba23-4524-8284-539f88f02ddf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.872176] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e86d00ec-0075-4e68-af63-b12b1ee28dbe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.879210] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-094f7862-4026-4392-82a4-3acf92901cc2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1344.892111] env[68492]: DEBUG nova.compute.provider_tree [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1344.900691] env[68492]: DEBUG nova.scheduler.client.report [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1344.917958] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1344.918493] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Start building networks asynchronously for instance.
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1344.953827] env[68492]: DEBUG nova.compute.utils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1344.956304] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1344.956842] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1344.969262] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1345.035313] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1345.060832] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1345.061091] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1345.061256] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1345.061433] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1345.061576] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1345.061717] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1345.061919] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1345.062090] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1345.062260] env[68492]: DEBUG nova.virt.hardware [None
req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1345.062421] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1345.062592] env[68492]: DEBUG nova.virt.hardware [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1345.063476] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc0788e-afcb-4f93-ac62-9ea6194ddf4a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1345.071084] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33596e17-a6da-44ac-8f3b-89ba9124cdfe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1345.241333] env[68492]: DEBUG nova.policy [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7bf86f7359545ebbf45a5a002c88e5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839d10b6a7894af08ca3717477bcd473', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1345.875891] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Successfully created port: c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1346.489887] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "40087617-1982-4727-ac78-1cb6437b11c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1346.489887] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1346.947307] env[68492]: DEBUG nova.compute.manager [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Received event network-vif-plugged-c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1346.947307] env[68492]: DEBUG oslo_concurrency.lockutils [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] Acquiring lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1346.947307] env[68492]: DEBUG oslo_concurrency.lockutils [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1346.947831] env[68492]: DEBUG oslo_concurrency.lockutils [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1346.948167] env[68492]: DEBUG nova.compute.manager [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] No waiting events found dispatching network-vif-plugged-c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1346.948540] env[68492]: WARNING nova.compute.manager [req-06e37f49-97a3-411b-b755-23b67a35aa1e req-f85931c7-e06c-4e9d-a347-43c88ac785df service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Received unexpected event network-vif-plugged-c21efef1-eac9-42db-b2fa-bdb4b466bda3 for instance with vm_state building and task_state spawning.
[ 1347.036467] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Successfully updated port: c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1347.051020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1347.051020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1347.051020] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1347.312370] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1347.760711] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Updating instance_info_cache with network_info: [{"id": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "address": "fa:16:3e:63:42:25", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc21efef1-ea", "ovs_interfaceid": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1347.773514] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1347.773809] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance network_info: |[{"id": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "address": "fa:16:3e:63:42:25", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc21efef1-ea", "ovs_interfaceid": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1347.774294] 
env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:63:42:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '310b8ba9-edca-4135-863e-f4a786dd4a77', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c21efef1-eac9-42db-b2fa-bdb4b466bda3', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1347.782885] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating folder: Project (839d10b6a7894af08ca3717477bcd473). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1347.783503] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-319171ce-8634-490d-a173-c00d3791a587 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.794369] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created folder: Project (839d10b6a7894af08ca3717477bcd473) in parent group-v677434. [ 1347.794573] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating folder: Instances. Parent ref: group-v677520. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1347.794813] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a5e21efe-f6cf-4e0e-8838-5cc71d3b1a89 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.802946] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created folder: Instances in parent group-v677520. [ 1347.803245] env[68492]: DEBUG oslo.service.loopingcall [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1347.803375] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1347.803567] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-20994b8f-9412-421c-ac69-0b4da017d1f6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1347.823531] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1347.823531] env[68492]: value = "task-3395483" [ 1347.823531] env[68492]: _type = "Task" [ 1347.823531] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1347.830644] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395483, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1348.332700] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395483, 'name': CreateVM_Task, 'duration_secs': 0.27109} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1348.333107] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1348.333585] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1348.333744] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1348.334061] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1348.334308] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-68462cc0-06eb-44d6-8f7c-ad0a9bd4d499 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.339072] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1348.339072] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522c4be0-2fa4-9c3b-0092-70b737db5650" [ 1348.339072] env[68492]: _type = "Task" [ 1348.339072] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.346952] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522c4be0-2fa4-9c3b-0092-70b737db5650, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1348.849114] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1348.849328] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1348.849537] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1349.136311] env[68492]: DEBUG nova.compute.manager [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Received event network-changed-c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1349.136503] env[68492]: DEBUG nova.compute.manager [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Refreshing instance network info cache due to event network-changed-c21efef1-eac9-42db-b2fa-bdb4b466bda3. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1349.136711] env[68492]: DEBUG oslo_concurrency.lockutils [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] Acquiring lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1349.136850] env[68492]: DEBUG oslo_concurrency.lockutils [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] Acquired lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1349.137014] env[68492]: DEBUG nova.network.neutron [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Refreshing network info cache for port c21efef1-eac9-42db-b2fa-bdb4b466bda3 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1349.553912] env[68492]: DEBUG nova.network.neutron [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Updated VIF entry in instance network info cache for port c21efef1-eac9-42db-b2fa-bdb4b466bda3. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1349.554333] env[68492]: DEBUG nova.network.neutron [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Updating instance_info_cache with network_info: [{"id": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "address": "fa:16:3e:63:42:25", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc21efef1-ea", "ovs_interfaceid": "c21efef1-eac9-42db-b2fa-bdb4b466bda3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1349.571089] env[68492]: DEBUG oslo_concurrency.lockutils [req-913b9aa1-8086-407f-8264-e473b464bd1c req-4bd818d5-0527-4ad5-a62c-92c9faa7f123 service nova] Releasing lock "refresh_cache-4a7172f0-050f-4040-b974-91ce9ac96a0d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1354.807130] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1354.807435] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.225934] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1360.249360] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1360.249551] env[68492]: DEBUG 
oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1362.230613] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1362.242675] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.242853] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.243028] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.243188] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1362.244289] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60628b55-f74b-4b7d-a26c-a14659965101 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.253043] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b46d6397-8997-4d01-8156-be338b818897 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.266730] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b60944f-d5eb-4aaa-a718-26a5bf810da5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.273566] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99c3b37d-5d71-4378-b72c-c1876bf1d1cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.303730] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180962MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1362.303889] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.304117] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.377100] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8c72085d-697c-4829-866a-4d642f18d2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377285] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377416] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377540] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377660] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377782] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.377901] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.378030] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.378173] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.378347] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1362.390814] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 9694688e-b937-4999-9b25-3caea82695b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.401361] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 51e8e546-2bd7-495b-a81d-a6cdc4dba99c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.412083] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 6a630f7b-3c45-42b2-b8ab-e93490cc1eb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.422029] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.431301] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 74853d33-dc81-497b-9af3-72973e20e60b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.440492] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f5dde0b2-1403-466c-aa23-a5573915256d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.449462] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.458562] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 40087617-1982-4727-ac78-1cb6437b11c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.467795] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1362.468039] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1362.468196] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1362.674679] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5927e46-0b61-45c8-8f31-435490ffc059 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.682314] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4a199cd-2f76-4a13-9d4d-6f768a82b33d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.712127] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e8acf81-4345-4ecd-9507-0ea429bf3103 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.719056] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f7f76ff-eff0-4646-9bed-10e2da5fd8f3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.731834] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1362.740490] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1362.753893] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1362.754119] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.450s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1363.754615] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1363.754860] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1363.754941] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1363.776631] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.776751] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.776866] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.776989] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777131] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777298] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777434] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777554] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777672] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777788] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1363.777908] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1363.778425] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.250337] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.231898] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.232244] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.232383] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1369.231905] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1376.078057] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.078354] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.673773] env[68492]: DEBUG oslo_concurrency.lockutils [None req-86bf1bc2-b937-4b3d-ba2b-cc6780921a49 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "2ffaadba-8144-4c60-b055-95619cd75024" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.673773] env[68492]: DEBUG oslo_concurrency.lockutils [None req-86bf1bc2-b937-4b3d-ba2b-cc6780921a49 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "2ffaadba-8144-4c60-b055-95619cd75024" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.985614] env[68492]: DEBUG oslo_concurrency.lockutils [None req-564de4ee-9385-4996-b9a1-651a0a78f64d tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "0b8f7208-aba6-4411-9ce1-1493367220b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.986078] env[68492]: DEBUG oslo_concurrency.lockutils [None req-564de4ee-9385-4996-b9a1-651a0a78f64d tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "0b8f7208-aba6-4411-9ce1-1493367220b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1389.880868] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1390.614538] env[68492]: WARNING oslo_vmware.rw_handles [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1390.614538] env[68492]: ERROR oslo_vmware.rw_handles [ 1390.614938] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1390.616890] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1390.617149] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Copying Virtual Disk [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/a028efc2-950f-47e9-afba-237c64684054/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1390.617458] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d277d3ab-1a86-4b8a-9ed6-761ca42489e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1390.625722] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 
tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for the task: (returnval){ [ 1390.625722] env[68492]: value = "task-3395484" [ 1390.625722] env[68492]: _type = "Task" [ 1390.625722] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1390.633740] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Task: {'id': task-3395484, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1391.136431] env[68492]: DEBUG oslo_vmware.exceptions [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1391.136764] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1391.137340] env[68492]: ERROR nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1391.137340] env[68492]: Faults: ['InvalidArgument'] [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Traceback (most recent call last): [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] yield resources [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self.driver.spawn(context, instance, image_meta, [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self._fetch_image_if_missing(context, vi) [ 1391.137340] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] image_cache(vi, tmp_image_ds_loc) [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] vm_util.copy_virtual_disk( [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] session._wait_for_task(vmdk_copy_task) [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] return self.wait_for_task(task_ref) [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] return evt.wait() [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] result = hub.switch() [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1391.137878] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] return self.greenlet.switch() [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self.f(*self.args, **self.kw) [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] raise exceptions.translate_fault(task_info.error) [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Faults: ['InvalidArgument'] [ 1391.138250] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] [ 1391.138250] env[68492]: INFO nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 
8c72085d-697c-4829-866a-4d642f18d2f6] Terminating instance [ 1391.139213] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1391.139441] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1391.140157] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1391.140348] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1391.140581] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7282075a-0f2d-4183-974b-3d604549df1d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.143081] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06f64eb7-8b8e-4c7a-997d-e0939cd69d39 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.149826] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1391.150066] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-34124108-0973-4b1c-8f1d-960570f7d723 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.152191] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1391.152366] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1391.153330] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a065ec25-2773-44eb-bb47-08c12e9718a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.158147] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for the task: (returnval){ [ 1391.158147] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e313c4-9547-5638-a648-fb1ac60e48e8" [ 1391.158147] env[68492]: _type = "Task" [ 1391.158147] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1391.165395] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e313c4-9547-5638-a648-fb1ac60e48e8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1391.221889] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1391.222144] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1391.222322] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Deleting the datastore file [datastore2] 8c72085d-697c-4829-866a-4d642f18d2f6 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1391.222585] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-504aeffc-39ff-4137-bf78-3ce4c38f3951 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.229101] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for the task: (returnval){ [ 1391.229101] env[68492]: value = "task-3395486" [ 1391.229101] env[68492]: _type = "Task" [ 1391.229101] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1391.237387] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Task: {'id': task-3395486, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1391.668230] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1391.668490] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Creating directory with path [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1391.668733] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1a9e996-d10d-4780-a179-28a97a25154d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.679793] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Created directory with path [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1391.679987] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Fetch image to [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1391.680173] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1391.680888] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa80a82-9eb5-486a-9d24-71efa8af8778 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.687226] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b148a5-a458-4db0-8ac6-9118d7fd32ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.696065] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d47a391f-eed5-455b-9765-76ef06e3b572 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.727285] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-aeeb79a4-a129-4c3a-b0ee-c84722b3a391 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.737734] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cac744e2-c459-4732-b0e2-b46b9ac3e372 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.739359] env[68492]: DEBUG oslo_vmware.api [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Task: {'id': task-3395486, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075537} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1391.739594] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1391.739769] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1391.739931] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1391.740114] env[68492]: INFO nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Took 0.60 seconds to destroy the instance on the hypervisor. 
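[Editor's note] The wait_for_task/_poll_task records above (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) all follow the same oslo.vmware pattern: invoke an asynchronous vSphere method via the session, receive a task managed object, then poll it until it reports success ("completed successfully ... duration_secs") or error. A minimal sketch of that pattern, assuming oslo.vmware is installed and a reachable vCenter; the host, credentials, and datastore path below are placeholders, not values from this log:

    from oslo_vmware import api

    # Session setup; api_retry_count and task_poll_interval control the retry
    # and polling cadence that shows up as the repeated "progress is 0%" records.
    session = api.VMwareAPISession(
        'vcenter.example.test', 'user', 'secret',
        api_retry_count=3, task_poll_interval=0.5)

    # Invoke an asynchronous vSphere method; it returns a task managed object.
    # (Against vCenter, DeleteDatastoreFile_Task typically also needs a
    # Datacenter moref via the optional datacenter= argument.)
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task',
        session.vim.service_content.fileManager,
        name='[datastore2] vmware_temp/example')

    # Blocks while polling the task object, and raises a translated fault
    # exception if the task ends in the 'error' state.
    session.wait_for_task(task)
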
[ 1391.742162] env[68492]: DEBUG nova.compute.claims [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1391.742326] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1391.742583] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1391.774844] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1391.826275] env[68492]: DEBUG oslo_vmware.rw_handles [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1391.888699] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1391.888938] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1391.889405] env[68492]: DEBUG oslo_vmware.rw_handles [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Completed reading data from the image iterator. 
{{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1391.889581] env[68492]: DEBUG oslo_vmware.rw_handles [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1392.068828] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dcb2440-7ae1-47af-92d5-2944e7f19089 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.076467] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47aaed4d-667b-44d3-97a1-95699d655eb1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.105492] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-443c5c0c-23c9-429f-b088-dd5388447497 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.112258] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d45f175a-2e7d-4656-b4d1-46b34d42cd03 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.126065] env[68492]: DEBUG nova.compute.provider_tree [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1392.134760] env[68492]: DEBUG nova.scheduler.client.report [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1392.148126] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.405s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1392.148692] env[68492]: ERROR nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] 
Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1392.148692] env[68492]: Faults: ['InvalidArgument'] [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Traceback (most recent call last): [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self.driver.spawn(context, instance, image_meta, [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self._fetch_image_if_missing(context, vi) [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] image_cache(vi, tmp_image_ds_loc) [ 1392.148692] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] vm_util.copy_virtual_disk( [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] session._wait_for_task(vmdk_copy_task) [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] return self.wait_for_task(task_ref) [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] return evt.wait() [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] result = hub.switch() [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 
8c72085d-697c-4829-866a-4d642f18d2f6] return self.greenlet.switch() [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1392.149011] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] self.f(*self.args, **self.kw) [ 1392.149341] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1392.149341] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] raise exceptions.translate_fault(task_info.error) [ 1392.149341] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1392.149341] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Faults: ['InvalidArgument'] [ 1392.149341] env[68492]: ERROR nova.compute.manager [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] [ 1392.149468] env[68492]: DEBUG nova.compute.utils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1392.150905] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Build of instance 8c72085d-697c-4829-866a-4d642f18d2f6 was re-scheduled: A specified parameter was not correct: fileType [ 1392.150905] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1392.151286] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1392.151451] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1392.151616] env[68492]: DEBUG nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1392.151769] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1392.674429] env[68492]: DEBUG nova.network.neutron [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1392.692050] env[68492]: INFO nova.compute.manager [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Took 0.54 seconds to deallocate network for instance. [ 1392.838326] env[68492]: INFO nova.scheduler.client.report [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Deleted allocations for instance 8c72085d-697c-4829-866a-4d642f18d2f6 [ 1392.861971] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92822b0f-6b4c-466b-a084-608ab40e7978 tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 596.452s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1392.864418] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 399.756s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1392.864418] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Acquiring lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1392.864418] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1392.864695] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1392.865511] env[68492]: INFO nova.compute.manager [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Terminating instance [ 1392.867172] env[68492]: DEBUG nova.compute.manager [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1392.867752] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1392.867910] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5601e0ad-5089-4600-8bbb-90898cc445fa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.876738] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52029186-0100-4b53-b30c-e7398a0db752 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1392.887257] env[68492]: DEBUG nova.compute.manager [None req-27765813-a0ad-45bd-9761-047a220ae9aa tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] [instance: 9694688e-b937-4999-9b25-3caea82695b3] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1392.906947] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8c72085d-697c-4829-866a-4d642f18d2f6 could not be found. 
[ 1392.907545] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1392.907545] env[68492]: INFO nova.compute.manager [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1392.907664] env[68492]: DEBUG oslo.service.loopingcall [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1392.907834] env[68492]: DEBUG nova.compute.manager [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1392.907929] env[68492]: DEBUG nova.network.neutron [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1392.930801] env[68492]: DEBUG nova.network.neutron [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1392.935860] env[68492]: DEBUG nova.compute.manager [None req-27765813-a0ad-45bd-9761-047a220ae9aa tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] [instance: 9694688e-b937-4999-9b25-3caea82695b3] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1392.939027] env[68492]: INFO nova.compute.manager [-] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] Took 0.03 seconds to deallocate network for instance. [ 1392.955249] env[68492]: DEBUG oslo_concurrency.lockutils [None req-27765813-a0ad-45bd-9761-047a220ae9aa tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Lock "9694688e-b937-4999-9b25-3caea82695b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 198.790s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1392.963366] env[68492]: DEBUG nova.compute.manager [None req-eb53b3dc-3a8e-4491-8226-264b6e926874 tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] [instance: 51e8e546-2bd7-495b-a81d-a6cdc4dba99c] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1392.984844] env[68492]: DEBUG nova.compute.manager [None req-eb53b3dc-3a8e-4491-8226-264b6e926874 tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] [instance: 51e8e546-2bd7-495b-a81d-a6cdc4dba99c] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1393.004269] env[68492]: DEBUG oslo_concurrency.lockutils [None req-eb53b3dc-3a8e-4491-8226-264b6e926874 tempest-ServerShowV247Test-1088349381 tempest-ServerShowV247Test-1088349381-project-member] Lock "51e8e546-2bd7-495b-a81d-a6cdc4dba99c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 198.415s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1393.020204] env[68492]: DEBUG nova.compute.manager [None req-8d7cb0f9-a084-482b-9860-8d9014b0127f tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 6a630f7b-3c45-42b2-b8ab-e93490cc1eb3] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1393.027754] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e8496424-4feb-42c4-99c7-efcbfdaa653c tempest-VolumesAdminNegativeTest-447821136 tempest-VolumesAdminNegativeTest-447821136-project-member] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.165s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1393.029684] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 232.784s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1393.029867] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 8c72085d-697c-4829-866a-4d642f18d2f6] During sync_power_state the instance has a pending task (deleting). Skip. [ 1393.030050] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "8c72085d-697c-4829-866a-4d642f18d2f6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1393.044537] env[68492]: DEBUG nova.compute.manager [None req-8d7cb0f9-a084-482b-9860-8d9014b0127f tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 6a630f7b-3c45-42b2-b8ab-e93490cc1eb3] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1393.063408] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8d7cb0f9-a084-482b-9860-8d9014b0127f tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "6a630f7b-3c45-42b2-b8ab-e93490cc1eb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.463s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1393.073200] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1393.118367] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1393.118626] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1393.120061] env[68492]: INFO nova.compute.claims [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1393.371966] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c00e5ec5-368d-4f6d-ad99-08026f76d12e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1393.380008] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a00cf196-fb8b-45b1-a32d-a03c2e185c0a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1393.409208] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86b1150c-4dc9-452b-9536-355e0a3a06d3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1393.415909] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44d859fe-04d8-4dc0-b468-b0bd67e4cace {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1393.431092] env[68492]: DEBUG nova.compute.provider_tree [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1393.440455] env[68492]: DEBUG nova.scheduler.client.report [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1393.456207] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1393.456696] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1393.490765] env[68492]: DEBUG nova.compute.utils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1393.492141] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1393.492358] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1393.503231] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1393.558691] env[68492]: DEBUG nova.policy [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '568ab24cbb7d4833bb8cdfd51db89db5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80fa34aee50b4509a18abca39075924a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1393.566072] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1393.590054] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1393.590340] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1393.590533] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1393.590776] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1393.590926] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1393.591112] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1393.591368] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1393.591559] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1393.591754] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1393.591924] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1393.592108] env[68492]: DEBUG nova.virt.hardware [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1393.592940] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64405fae-93ff-4748-9cac-74ed48413305 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1393.600999] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f7376fc-f5c5-4b53-9a67-132f31d7c5ee {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1394.264944] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Successfully created port: 43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1394.937459] env[68492]: DEBUG nova.compute.manager [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Received event network-vif-plugged-43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1394.937733] env[68492]: DEBUG oslo_concurrency.lockutils [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] Acquiring lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1394.937973] env[68492]: DEBUG oslo_concurrency.lockutils [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1394.938334] env[68492]: DEBUG oslo_concurrency.lockutils [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1394.938600] env[68492]: DEBUG nova.compute.manager [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] 
No waiting events found dispatching network-vif-plugged-43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1394.938864] env[68492]: WARNING nova.compute.manager [req-68cabd7d-b44f-4208-9aa8-c649fb3e127b req-779fba52-c7e5-436f-a1ae-9a17ff6e5cf5 service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Received unexpected event network-vif-plugged-43cb18d5-c6a7-49ba-84f5-477cd6bcca2b for instance with vm_state building and task_state spawning. [ 1395.069985] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Successfully updated port: 43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1395.084184] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1395.084421] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1395.084632] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1395.161237] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1395.436995] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Updating instance_info_cache with network_info: [{"id": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "address": "fa:16:3e:39:de:a1", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43cb18d5-c6", "ovs_interfaceid": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1395.453063] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1395.453366] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance network_info: |[{"id": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "address": "fa:16:3e:39:de:a1", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43cb18d5-c6", "ovs_interfaceid": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1395.453761] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None 
req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:39:de:a1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35342bcb-8b06-472e-b3c0-43fd3d6c4b30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43cb18d5-c6a7-49ba-84f5-477cd6bcca2b', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1395.461556] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating folder: Project (80fa34aee50b4509a18abca39075924a). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1395.462330] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-797f7f8e-20c4-42ef-b6aa-236a5de4ddbe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1395.474089] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created folder: Project (80fa34aee50b4509a18abca39075924a) in parent group-v677434. [ 1395.474287] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating folder: Instances. Parent ref: group-v677523. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1395.474533] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c8451dbd-83c5-44fa-868b-d5086b5a107e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1395.482859] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created folder: Instances in parent group-v677523. [ 1395.483107] env[68492]: DEBUG oslo.service.loopingcall [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1395.483294] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1395.483489] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-422a728a-3421-4c8f-9bda-991bfd858429 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1395.504130] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1395.504130] env[68492]: value = "task-3395489" [ 1395.504130] env[68492]: _type = "Task" [ 1395.504130] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1395.512668] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395489, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1395.659526] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1396.015615] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395489, 'name': CreateVM_Task, 'duration_secs': 0.31364} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1396.015918] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1396.016481] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1396.016715] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1396.017093] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1396.017337] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88df9064-335e-4259-97cf-fd14d8fdfe8e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1396.022322] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 1396.022322] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]526f77ed-3fcf-4b94-07f7-4834361716c7" [ 1396.022322] env[68492]: _type = "Task" [ 1396.022322] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1396.032110] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]526f77ed-3fcf-4b94-07f7-4834361716c7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1396.532947] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1396.532947] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1396.533267] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1396.966852] env[68492]: DEBUG nova.compute.manager [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Received event network-changed-43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1396.967212] env[68492]: DEBUG nova.compute.manager [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Refreshing instance network info cache due to event network-changed-43cb18d5-c6a7-49ba-84f5-477cd6bcca2b. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1396.967563] env[68492]: DEBUG oslo_concurrency.lockutils [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] Acquiring lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1396.967816] env[68492]: DEBUG oslo_concurrency.lockutils [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] Acquired lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1396.968069] env[68492]: DEBUG nova.network.neutron [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Refreshing network info cache for port 43cb18d5-c6a7-49ba-84f5-477cd6bcca2b {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1397.204709] env[68492]: DEBUG nova.network.neutron [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Updated VIF entry in instance network info cache for port 43cb18d5-c6a7-49ba-84f5-477cd6bcca2b. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1397.205072] env[68492]: DEBUG nova.network.neutron [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Updating instance_info_cache with network_info: [{"id": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "address": "fa:16:3e:39:de:a1", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43cb18d5-c6", "ovs_interfaceid": "43cb18d5-c6a7-49ba-84f5-477cd6bcca2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1397.214507] env[68492]: DEBUG oslo_concurrency.lockutils [req-c11d6808-decd-4b2a-a0de-c681231c1ccf req-d5260613-5da8-492a-9884-6794fd2b670c service nova] Releasing lock "refresh_cache-fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1421.232351] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1422.231565] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.231964] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.232249] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1423.232284] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1423.257325] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.257508] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.257677] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.257739] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.257860] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.257985] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.258121] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.258240] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.258368] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.258494] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1423.258616] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1423.259165] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.259347] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.272334] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1423.272534] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1423.272743] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1423.272907] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1423.273978] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eca58b8-c553-4de5-972b-dc939042be0f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.282973] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b6b6016-2f17-4017-9613-3332a453392e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.297102] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8c50802-23e0-45ad-9f1e-047077e8d5a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.303365] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f361a1c-7e47-4dfb-8492-07adcf7e811b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.335294] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180919MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1423.335294] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1423.335427] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1423.519148] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519262] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519358] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519484] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519606] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519727] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519846] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.519962] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.520091] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.520209] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1423.534024] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance f5dde0b2-1403-466c-aa23-a5573915256d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.544750] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.555728] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 40087617-1982-4727-ac78-1cb6437b11c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.566531] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.577744] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.587831] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2ffaadba-8144-4c60-b055-95619cd75024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.598412] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0b8f7208-aba6-4411-9ce1-1493367220b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.608123] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1423.608355] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1423.608507] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1423.625818] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1423.641897] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1423.642473] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1423.654897] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1423.675148] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1423.870036] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ca421f2-71c6-4496-bf81-037bffdfd75d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.877881] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-885926a9-9c90-4909-8bf3-f4bf25bc902c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.907452] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2313f1-2497-4e9f-af4f-c3168d7072df {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.914156] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a41b04-b847-4a26-af77-85d119a07233 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1423.926984] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1423.935335] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1423.951164] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1423.951342] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.945312] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1427.008635] env[68492]: DEBUG oslo_concurrency.lockutils [None req-552b44ee-eb55-4493-bf5a-dac02867570d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Acquiring lock "8bf43303-71b9-4a37-acfd-1915196b71f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1427.009184] env[68492]: DEBUG oslo_concurrency.lockutils [None req-552b44ee-eb55-4493-bf5a-dac02867570d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "8bf43303-71b9-4a37-acfd-1915196b71f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1427.231128] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1427.231389] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1427.231538] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1429.230759] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1430.230715] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1430.232375] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 1430.240021] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] There are 0 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 1430.240146] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.237882] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1435.238213] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 1437.967775] env[68492]: WARNING oslo_vmware.rw_handles [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 
1437.967775] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1437.967775] env[68492]: ERROR oslo_vmware.rw_handles [ 1437.968478] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1437.970139] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1437.970387] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Copying Virtual Disk [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/508c9cec-363b-47da-bc2b-84ed232f8331/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1437.970674] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f3732eab-f229-4f13-9335-715a088f2549 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1437.978990] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for the task: (returnval){ [ 1437.978990] env[68492]: value = "task-3395490" [ 1437.978990] env[68492]: _type = "Task" [ 1437.978990] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1437.986520] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Task: {'id': task-3395490, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1438.488319] env[68492]: DEBUG oslo_vmware.exceptions [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1438.488446] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1438.489011] env[68492]: ERROR nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1438.489011] env[68492]: Faults: ['InvalidArgument'] [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Traceback (most recent call last): [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] yield resources [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self.driver.spawn(context, instance, image_meta, [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self._fetch_image_if_missing(context, vi) [ 1438.489011] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] image_cache(vi, tmp_image_ds_loc) [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] vm_util.copy_virtual_disk( [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] session._wait_for_task(vmdk_copy_task) [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return self.wait_for_task(task_ref) [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return evt.wait() [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] result = hub.switch() [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1438.489459] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return self.greenlet.switch() [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self.f(*self.args, **self.kw) [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] raise exceptions.translate_fault(task_info.error) [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Faults: ['InvalidArgument'] [ 1438.489796] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] [ 1438.489796] env[68492]: INFO nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Terminating instance [ 1438.490859] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1438.491076] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1438.491318] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3ce5ce0e-bc40-4571-9439-561b0b94ec4f {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.493659] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1438.493850] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1438.494575] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6789016-ecc6-401c-bfb7-a34ae2f02155 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.501292] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1438.501501] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1edcdb2b-afbe-448e-bf9a-f3d09582894e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.503630] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1438.503798] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1438.504732] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8673e650-fb8d-4f13-9198-f12d3cf0c1a1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.510504] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for the task: (returnval){ [ 1438.510504] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52140463-5c16-0bde-71cd-5eb01a760c63" [ 1438.510504] env[68492]: _type = "Task" [ 1438.510504] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1438.521869] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52140463-5c16-0bde-71cd-5eb01a760c63, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1438.576983] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1438.577253] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1438.577404] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Deleting the datastore file [datastore2] bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1438.577692] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8d69df14-3e20-4bd1-ac2f-1baa39102c79 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1438.587609] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for the task: (returnval){ [ 1438.587609] env[68492]: value = "task-3395492" [ 1438.587609] env[68492]: _type = "Task" [ 1438.587609] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1438.596529] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Task: {'id': task-3395492, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1439.020572] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1439.020920] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Creating directory with path [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1439.021025] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-95ffe93f-e86b-4434-9485-5c0879ac4b8a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.036606] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Created directory with path [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1439.036800] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Fetch image to [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1439.036970] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1439.037684] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66c60cf7-008f-4a45-b06c-9a5ddd42d6b5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.044044] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2424d819-c3b3-4e0e-988b-a955bf0a3047 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.052647] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ff3441c-448c-409d-98b8-085d6b7cd868 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.082519] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-537b69fb-3132-4164-a354-00f8d435092e {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.088062] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2640f743-ea9a-4019-b7fd-dcbf9aca8d2c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.096486] env[68492]: DEBUG oslo_vmware.api [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Task: {'id': task-3395492, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.118994} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1439.096792] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1439.096987] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1439.097175] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1439.097345] env[68492]: INFO nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Took 0.60 seconds to destroy the instance on the hypervisor. 
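The task exchanges above (task-3395490, task-3395492) follow the standard oslo.vmware pattern: a vSphere method ending in _Task returns a task reference immediately, and wait_for_task() drives _poll_task until the task reports success or raises a translated fault. A minimal sketch of that pattern, assuming a placeholder vCenter host, credentials, and datastore path (none of these values come from this log):

    # Sketch only: the host, credentials, and datastore path are placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test',          # placeholder vCenter host
        'svc-user', 'secret',       # placeholder credentials
        api_retry_count=10,         # retry budget for failed API calls
        task_poll_interval=0.5)     # seconds between _poll_task iterations

    # *_Task methods (CopyVirtualDisk_Task, DeleteDatastoreFile_Task, ...)
    # return a task managed-object reference rather than a result.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] some-dir/some-file.vmdk',  # placeholder path
        datacenter=None)  # real callers pass the datacenter ref here

    # wait_for_task() polls task progress (the "progress is 0%." lines above)
    # and raises a translated fault on task error -- the VimFaultException
    # with Faults: ['InvalidArgument'] raised for CopyVirtualDisk_Task earlier.
    session.wait_for_task(task)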
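For reference, the inventory dict logged for provider dba0d66f-84ca-40a4-90ee-609cf684af11 maps to schedulable capacity via placement's (total - reserved) * allocation_ratio rule, with max_unit capping any single allocation. A quick check of the figures reported above:

    # Placement capacity formula, applied to the logged inventory values.
    vcpu_capacity   = (48 - 0) * 4.0        # 192 schedulable VCPUs (10 in use)
    memory_capacity = (196590 - 512) * 1.0  # 196078 MB
    disk_capacity   = (400 - 0) * 1.0       # 400 GB (max_unit 102 per alloc)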
[ 1439.099381] env[68492]: DEBUG nova.compute.claims [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1439.099595] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1439.099789] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1439.114164] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1439.162411] env[68492]: DEBUG oslo_vmware.rw_handles [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1439.222509] env[68492]: DEBUG oslo_vmware.rw_handles [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1439.222750] env[68492]: DEBUG oslo_vmware.rw_handles [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1439.409200] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecfb8e71-1c12-4673-afdc-e169cf6254cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.417191] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17969ee6-f31f-43c2-a452-94500d5e202f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.447050] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c5f394-265b-4670-818c-7b477f112627 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.455029] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84546a60-505f-47f7-b312-31d86179159c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1439.469866] env[68492]: DEBUG nova.compute.provider_tree [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1439.477220] env[68492]: DEBUG nova.scheduler.client.report [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1439.492314] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.392s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1439.492916] env[68492]: ERROR nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1439.492916] env[68492]: Faults: ['InvalidArgument'] [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Traceback (most recent call last): [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1439.492916] env[68492]: 
ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self.driver.spawn(context, instance, image_meta, [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self._fetch_image_if_missing(context, vi) [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] image_cache(vi, tmp_image_ds_loc) [ 1439.492916] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] vm_util.copy_virtual_disk( [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] session._wait_for_task(vmdk_copy_task) [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return self.wait_for_task(task_ref) [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return evt.wait() [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] result = hub.switch() [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] return self.greenlet.switch() [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1439.493255] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] self.f(*self.args, **self.kw) [ 1439.493570] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1439.493570] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] raise exceptions.translate_fault(task_info.error) [ 1439.493570] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1439.493570] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Faults: ['InvalidArgument'] [ 1439.493570] env[68492]: ERROR nova.compute.manager [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] [ 1439.493765] env[68492]: DEBUG nova.compute.utils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1439.495112] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Build of instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a was re-scheduled: A specified parameter was not correct: fileType [ 1439.495112] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1439.495451] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1439.495646] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1439.495785] env[68492]: DEBUG nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1439.496074] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1440.072529] env[68492]: DEBUG nova.network.neutron [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1440.085787] env[68492]: INFO nova.compute.manager [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Took 0.59 seconds to deallocate network for instance. [ 1440.194263] env[68492]: INFO nova.scheduler.client.report [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Deleted allocations for instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a [ 1440.223662] env[68492]: DEBUG oslo_concurrency.lockutils [None req-203d9eb9-0b8a-403b-90e3-a4fb9d27ff0d tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 635.215s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.224914] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.847s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.225178] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Acquiring lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1440.225398] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.225570] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.227568] env[68492]: INFO nova.compute.manager [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Terminating instance [ 1440.229330] env[68492]: DEBUG nova.compute.manager [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1440.229518] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1440.229981] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a63b6f4b-49f9-406f-9879-01aa66b009c4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.239449] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a2bcddf-74e5-47c9-9ef2-79cf8d541c71 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.251431] env[68492]: DEBUG nova.compute.manager [None req-1b4f5886-26eb-4e75-ba9b-0eab140140cf tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] [instance: 74853d33-dc81-497b-9af3-72973e20e60b] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1440.272240] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a could not be found. [ 1440.272447] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1440.272622] env[68492]: INFO nova.compute.manager [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1440.272863] env[68492]: DEBUG oslo.service.loopingcall [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1440.273097] env[68492]: DEBUG nova.compute.manager [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1440.273197] env[68492]: DEBUG nova.network.neutron [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1440.283142] env[68492]: DEBUG nova.compute.manager [None req-1b4f5886-26eb-4e75-ba9b-0eab140140cf tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] [instance: 74853d33-dc81-497b-9af3-72973e20e60b] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1440.295498] env[68492]: DEBUG nova.network.neutron [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1440.303385] env[68492]: INFO nova.compute.manager [-] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] Took 0.03 seconds to deallocate network for instance. [ 1440.308310] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1b4f5886-26eb-4e75-ba9b-0eab140140cf tempest-AttachVolumeShelveTestJSON-975630336 tempest-AttachVolumeShelveTestJSON-975630336-project-member] Lock "74853d33-dc81-497b-9af3-72973e20e60b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.341s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.317858] env[68492]: DEBUG nova.compute.manager [None req-6aba1fe3-3953-4b08-abd8-cda96828956d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: f5dde0b2-1403-466c-aa23-a5573915256d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1440.349454] env[68492]: DEBUG nova.compute.manager [None req-6aba1fe3-3953-4b08-abd8-cda96828956d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: f5dde0b2-1403-466c-aa23-a5573915256d] Instance disappeared before build. 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1440.370206] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6aba1fe3-3953-4b08-abd8-cda96828956d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "f5dde0b2-1403-466c-aa23-a5573915256d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.593s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.385229] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1440.407142] env[68492]: DEBUG oslo_concurrency.lockutils [None req-57cae674-0799-4111-9d4b-26d72797349c tempest-ServerActionsTestOtherA-404498483 tempest-ServerActionsTestOtherA-404498483-project-member] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.408189] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 280.163s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.408477] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1440.408672] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "bcf3ddfb-e22c-476a-ae02-3ffd6289ec4a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.437353] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1440.437601] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1440.439131] env[68492]: INFO nova.compute.claims [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1440.721257] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c273e3-75dd-493c-a884-08d4edbf6ac4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.728759] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2fdb0f7-3901-4aee-9dab-36d4ae9c0d92 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.758772] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5624164-8cc5-4aa5-8c62-042f3225aec3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.765621] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd2eee83-c23e-41cd-aadc-d58390a11b62 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.778620] env[68492]: DEBUG nova.compute.provider_tree [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1440.787430] env[68492]: DEBUG nova.scheduler.client.report [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1440.802957] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1440.803409] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1440.839388] env[68492]: DEBUG nova.compute.utils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1440.840890] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1440.840890] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1440.853890] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1440.919343] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1440.935982] env[68492]: DEBUG nova.policy [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06d98ba654414d2091d24b5304834776', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbfde028d2494faca2e128b80c7c6a0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1440.951290] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1440.951524] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1440.951681] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1440.951867] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1440.952027] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1440.952181] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1440.952384] 
env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1440.952543] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1440.952707] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1440.952869] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1440.953064] env[68492]: DEBUG nova.virt.hardware [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1440.953910] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-554b89b9-5d90-43f7-b420-57a7864388cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1440.963056] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76cc2a8a-cdef-4d01-acdf-ccf76ccd49ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1441.414291] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Successfully created port: 76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1442.315057] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "03afef99-e2dd-4467-8426-fbe50481aa6f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.468543] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Successfully updated port: 76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) _update_port 
/opt/stack/nova/nova/network/neutron.py:586}} [ 1442.490406] env[68492]: DEBUG nova.compute.manager [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Received event network-vif-plugged-76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1442.490633] env[68492]: DEBUG oslo_concurrency.lockutils [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] Acquiring lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.490843] env[68492]: DEBUG oslo_concurrency.lockutils [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1442.494269] env[68492]: DEBUG oslo_concurrency.lockutils [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1442.494269] env[68492]: DEBUG nova.compute.manager [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] No waiting events found dispatching network-vif-plugged-76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1442.494269] env[68492]: WARNING nova.compute.manager [req-717edcf0-1d7b-4ccf-92eb-c2b14eed9d05 req-b434873f-5b2b-42fb-a88e-4a40ee772f7d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Received unexpected event network-vif-plugged-76d8bb84-fdca-48c5-8cfe-c1e625e088dd for instance with vm_state building and task_state deleting. 
[ 1442.494269] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1442.494419] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1442.494419] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1442.563111] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1442.737516] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updating instance_info_cache with network_info: [{"id": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "address": "fa:16:3e:e3:cf:f6", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76d8bb84-fd", "ovs_interfaceid": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1442.751917] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1442.752252] env[68492]: DEBUG nova.compute.manager [None 
req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance network_info: |[{"id": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "address": "fa:16:3e:e3:cf:f6", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76d8bb84-fd", "ovs_interfaceid": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1442.752579] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e3:cf:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cd098b1c-636f-492d-b5ae-037cb0cae454', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '76d8bb84-fdca-48c5-8cfe-c1e625e088dd', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1442.760022] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating folder: Project (bbfde028d2494faca2e128b80c7c6a0d). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1442.760731] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-43fdc0b5-0d1c-41de-8e3f-8078ee27a225 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.770933] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created folder: Project (bbfde028d2494faca2e128b80c7c6a0d) in parent group-v677434. [ 1442.771125] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating folder: Instances. Parent ref: group-v677526. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1442.771343] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e9e7f1c2-7df5-4d1e-8848-a18e89e9c130 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.779469] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created folder: Instances in parent group-v677526. [ 1442.779684] env[68492]: DEBUG oslo.service.loopingcall [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1442.779909] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1442.780128] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dea5bb2e-943a-4f43-a3aa-77d2675fc1a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1442.798549] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1442.798549] env[68492]: value = "task-3395495" [ 1442.798549] env[68492]: _type = "Task" [ 1442.798549] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1442.807578] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395495, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1443.308172] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395495, 'name': CreateVM_Task, 'duration_secs': 0.39555} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1443.308340] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1443.309070] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1443.309224] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1443.309546] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1443.309810] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54b3c96f-4ba1-4839-b4eb-772b69c694ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1443.314119] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 1443.314119] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5286168c-f4d0-5082-fddd-a2b66fa85e79" [ 1443.314119] env[68492]: _type = "Task" [ 1443.314119] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1443.322278] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5286168c-f4d0-5082-fddd-a2b66fa85e79, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1443.824549] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1443.824936] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1443.825013] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1444.516339] env[68492]: DEBUG nova.compute.manager [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Received event network-changed-76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1444.516535] env[68492]: DEBUG nova.compute.manager [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Refreshing instance network info cache due to event network-changed-76d8bb84-fdca-48c5-8cfe-c1e625e088dd. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1444.516826] env[68492]: DEBUG oslo_concurrency.lockutils [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] Acquiring lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1444.516988] env[68492]: DEBUG oslo_concurrency.lockutils [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] Acquired lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1444.517329] env[68492]: DEBUG nova.network.neutron [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Refreshing network info cache for port 76d8bb84-fdca-48c5-8cfe-c1e625e088dd {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1444.964753] env[68492]: DEBUG nova.network.neutron [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updated VIF entry in instance network info cache for port 76d8bb84-fdca-48c5-8cfe-c1e625e088dd. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1444.965127] env[68492]: DEBUG nova.network.neutron [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updating instance_info_cache with network_info: [{"id": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "address": "fa:16:3e:e3:cf:f6", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76d8bb84-fd", "ovs_interfaceid": "76d8bb84-fdca-48c5-8cfe-c1e625e088dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1444.974448] env[68492]: DEBUG oslo_concurrency.lockutils [req-00179dce-1d22-4d3e-bcb8-2b07d0a511bf req-75ef1c07-2b61-49ce-b251-4a7304330f5d service nova] Releasing lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1456.676865] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "a90e989d-6aef-482f-b767-8dbdd7f29628" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1456.677181] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1482.235524] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1482.258588] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.230931] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.231265] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1484.231305] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1484.251353] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.251501] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.251634] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.251762] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.251886] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252015] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252181] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252316] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252439] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252557] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1484.252675] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1484.253164] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.253369] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1484.263866] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.264104] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.264279] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.264428] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1484.265786] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab657a89-63e3-4937-8953-f00e17ccd07d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.274434] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-defb5123-8f4b-4d27-ba63-aa54f52e8f45 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.288937] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69e248c0-e260-45bd-b55f-3186055bb837 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.294968] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6d0fff43-d2d6-4932-814b-a16a005a34c6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.323828] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180976MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1484.323961] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.324167] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.393985] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.393985] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394094] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394157] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394286] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394404] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394526] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394643] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394757] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.394895] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1484.405201] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 40087617-1982-4727-ac78-1cb6437b11c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.414499] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.423286] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.432303] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2ffaadba-8144-4c60-b055-95619cd75024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.440944] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0b8f7208-aba6-4411-9ce1-1493367220b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.450536] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.460134] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8bf43303-71b9-4a37-acfd-1915196b71f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.468406] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1484.468673] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1484.468859] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1484.665662] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8853ca-2e1c-42d7-bed6-f361788a3614 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.672876] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4161a90-4ec5-4b61-b4c9-4c5acb867838 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.703460] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43695e3f-fc9b-4f2c-87d3-bff0fea2eac6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.710797] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08b68859-f8d9-4298-b90f-2b78e3ea859a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.723551] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1484.731491] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1484.744693] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1484.744901] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.421s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.722888] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.226347] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.230975] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.231176] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1487.631510] env[68492]: WARNING oslo_vmware.rw_handles [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1487.631510] env[68492]: ERROR oslo_vmware.rw_handles [ 1487.632189] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1487.633782] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1487.634036] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Copying Virtual Disk [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/4250682c-6a76-42c3-b7b3-6d983a9f4d3f/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1487.634325] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d716d8c0-402f-4eba-87d8-d4428c1b7062 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1487.642692] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for the task: (returnval){ [ 1487.642692] env[68492]: value = "task-3395496" [ 1487.642692] env[68492]: _type = "Task" [ 1487.642692] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1487.650867] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Task: {'id': task-3395496, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1488.152541] env[68492]: DEBUG oslo_vmware.exceptions [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1488.152839] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1488.153414] env[68492]: ERROR nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1488.153414] env[68492]: Faults: ['InvalidArgument'] [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Traceback (most recent call last): [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] yield resources [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self.driver.spawn(context, instance, image_meta, [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self._fetch_image_if_missing(context, vi) [ 1488.153414] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] image_cache(vi, tmp_image_ds_loc) [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] vm_util.copy_virtual_disk( [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] session._wait_for_task(vmdk_copy_task) [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return self.wait_for_task(task_ref) [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return evt.wait() [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] result = hub.switch() [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1488.153841] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return self.greenlet.switch() [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self.f(*self.args, **self.kw) [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] raise exceptions.translate_fault(task_info.error) [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Faults: ['InvalidArgument'] [ 1488.154193] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] [ 1488.154193] env[68492]: INFO nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Terminating instance [ 1488.155813] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1488.155813] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1488.155813] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb70772c-4dc6-44e3-a741-071182ba71a7 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.157927] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1488.158151] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1488.158882] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-859b1664-44c0-4676-a1ed-251289a0b4e3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.166101] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1488.166341] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f592be8-c966-49e1-943a-b11fca0c2ff4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.168740] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1488.168740] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1488.169671] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-51b91114-8273-4f45-92f0-99495b7b1f87 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.174722] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for the task: (returnval){ [ 1488.174722] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520ded13-b511-32e8-b30e-221683831216" [ 1488.174722] env[68492]: _type = "Task" [ 1488.174722] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1488.181963] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]520ded13-b511-32e8-b30e-221683831216, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1488.231710] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1488.237177] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1488.237445] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1488.238032] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Deleting the datastore file [datastore2] 913d527c-f9f8-43da-b539-d1e2e2b71528 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1488.238032] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-30946246-85fb-4134-a788-10157787325d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.244016] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for the task: (returnval){ [ 1488.244016] env[68492]: value = "task-3395498" [ 1488.244016] env[68492]: _type = "Task" [ 1488.244016] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1488.252206] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Task: {'id': task-3395498, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1488.684551] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1488.684808] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Creating directory with path [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1488.685058] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cef76f0b-3941-4622-8f16-7acdcbb79e6a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.695202] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Created directory with path [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1488.695380] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Fetch image to [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1488.695549] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1488.696260] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4274386-0a79-4424-b01e-4cbc747f586b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.702343] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a96731be-8e59-4d48-b623-ccc2c9cb23cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.711058] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6910a4ee-1fdf-40f6-9491-c2b01a6215d0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.741868] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f250c8-dd4f-4647-a5c3-3f11cef28dc1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.748904] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f6d11ada-932a-49f7-9340-c6781e31b687 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1488.753208] env[68492]: DEBUG oslo_vmware.api [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Task: {'id': task-3395498, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068885} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1488.753807] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1488.754048] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1488.754273] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1488.754451] env[68492]: INFO nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1488.756523] env[68492]: DEBUG nova.compute.claims [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1488.756711] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1488.756931] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1488.774050] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1488.825032] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1488.883782] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1488.884052] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1489.031201] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-494a6de7-b494-4026-a17d-264ed7c6d22c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.038680] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-600c9f97-b5ba-486f-949f-86aa3c3e0e6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.067590] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fdb5f13-c557-40ed-9ac8-608a3672fee5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.074091] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ee71777-abe3-46f0-b4a9-b489231bafda {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.086342] env[68492]: DEBUG nova.compute.provider_tree [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1489.094374] env[68492]: DEBUG nova.scheduler.client.report [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1489.107027] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1489.107519] env[68492]: ERROR nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1489.107519] env[68492]: Faults: ['InvalidArgument'] [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Traceback (most recent call last): [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1489.107519] env[68492]: ERROR nova.compute.manager 
[instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self.driver.spawn(context, instance, image_meta, [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self._fetch_image_if_missing(context, vi) [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] image_cache(vi, tmp_image_ds_loc) [ 1489.107519] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] vm_util.copy_virtual_disk( [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] session._wait_for_task(vmdk_copy_task) [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return self.wait_for_task(task_ref) [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return evt.wait() [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] result = hub.switch() [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] return self.greenlet.switch() [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1489.107895] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] self.f(*self.args, **self.kw) [ 1489.108278] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1489.108278] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] raise exceptions.translate_fault(task_info.error) [ 1489.108278] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1489.108278] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Faults: ['InvalidArgument'] [ 1489.108278] env[68492]: ERROR nova.compute.manager [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] [ 1489.108278] env[68492]: DEBUG nova.compute.utils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1489.109546] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Build of instance 913d527c-f9f8-43da-b539-d1e2e2b71528 was re-scheduled: A specified parameter was not correct: fileType [ 1489.109546] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1489.109912] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1489.110094] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1489.110265] env[68492]: DEBUG nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1489.110427] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1489.622510] env[68492]: DEBUG nova.network.neutron [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1489.634239] env[68492]: INFO nova.compute.manager [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Took 0.52 seconds to deallocate network for instance. [ 1489.731955] env[68492]: INFO nova.scheduler.client.report [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Deleted allocations for instance 913d527c-f9f8-43da-b539-d1e2e2b71528 [ 1489.755506] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7720bd6a-e610-4d64-8f59-a7ba701115f2 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 535.294s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1489.756612] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 338.420s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1489.756838] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Acquiring lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1489.757073] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1489.757246] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1489.759239] env[68492]: INFO nova.compute.manager [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Terminating instance [ 1489.760884] env[68492]: DEBUG nova.compute.manager [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1489.761086] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1489.761620] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2749c292-bbab-4a0a-ad83-48f2290c8d37 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.771853] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a75a6a-de04-4e0f-a47e-ef82b7263e21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1489.783929] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1489.804758] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 913d527c-f9f8-43da-b539-d1e2e2b71528 could not be found. [ 1489.805036] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1489.805312] env[68492]: INFO nova.compute.manager [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1489.805742] env[68492]: DEBUG oslo.service.loopingcall [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1489.806028] env[68492]: DEBUG nova.compute.manager [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1489.806134] env[68492]: DEBUG nova.network.neutron [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1489.830686] env[68492]: DEBUG nova.network.neutron [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1489.838101] env[68492]: INFO nova.compute.manager [-] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] Took 0.03 seconds to deallocate network for instance. [ 1489.839099] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1489.839341] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1489.840771] env[68492]: INFO nova.compute.claims [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1489.941294] env[68492]: DEBUG oslo_concurrency.lockutils [None req-685327d3-0f7a-4cc8-b18b-d565b5df4138 tempest-ServerRescueTestJSON-235528625 tempest-ServerRescueTestJSON-235528625-project-member] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1489.942133] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 329.696s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1489.942320] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 913d527c-f9f8-43da-b539-d1e2e2b71528] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1489.942574] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "913d527c-f9f8-43da-b539-d1e2e2b71528" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1490.071396] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9804c265-7ed1-420c-bf7d-c07348d4a6d9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.079708] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91e88b5b-aa28-4217-a907-0834ecfb54dc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.110406] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-534d168e-9176-48d3-8d07-3d7a0ae31061 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.117952] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3340d58c-0faa-48d9-8974-a6eb441679cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.131158] env[68492]: DEBUG nova.compute.provider_tree [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1490.139755] env[68492]: DEBUG nova.scheduler.client.report [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1490.153873] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1490.154375] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1490.186902] env[68492]: DEBUG nova.compute.utils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1490.188476] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1490.188643] env[68492]: DEBUG nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1490.198486] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1490.230431] env[68492]: INFO nova.virt.block_device [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Booting with volume 76e52b39-acb3-4e5d-bd8a-19483b9f2e43 at /dev/sda [ 1490.232571] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.243982] env[68492]: DEBUG nova.policy [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f092d23e6e3646bf8036b38fe424e9f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de1c7f873d504a5394cf856387e69e3d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1490.266766] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ed868d88-c6ac-406c-a9cd-0678276ce277 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.280574] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b15f92-cc97-4c91-a17e-73d945a27d1b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.312573] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with 
opID=oslo.vmware-b1744b59-a849-4ed5-9772-e29637b31ec0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.320482] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd5ba42-e425-4c75-832e-ce36403fe649 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.350946] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd067cce-370c-4b1a-9b8a-5b3681a0716b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.357715] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa9ba220-7285-4957-a08b-fdc82d95e0a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.372020] env[68492]: DEBUG nova.virt.block_device [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updating existing volume attachment record: f90144a9-aa5a-4859-a7f5-9061e3b1aaab {{(pid=68492) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1490.578177] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1490.578735] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1490.578941] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1490.579119] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1490.579300] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Flavor pref 0:0:0 {{(pid=68492)
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1490.579445] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1490.579589] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1490.579792] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1490.579951] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1490.580605] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1490.580864] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1490.581117] env[68492]: DEBUG nova.virt.hardware [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1490.582249] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe3766c-1ee6-4dbb-9fc7-94c7a1b8412e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.591769] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c9d56ff-0c0f-4ed8-8697-54d4b05686a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.982409] env[68492]: DEBUG nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Successfully created port: f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1491.646036] env[68492]: DEBUG 
nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Successfully updated port: f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1491.662083] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1491.662283] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquired lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1491.662583] env[68492]: DEBUG nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1491.703650] env[68492]: DEBUG nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1491.741420] env[68492]: DEBUG nova.compute.manager [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Received event network-vif-plugged-f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1491.741642] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Acquiring lock "40087617-1982-4727-ac78-1cb6437b11c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1491.741851] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Lock "40087617-1982-4727-ac78-1cb6437b11c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1491.742291] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Lock "40087617-1982-4727-ac78-1cb6437b11c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1491.742538] env[68492]: DEBUG nova.compute.manager [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] No waiting events found dispatching network-vif-plugged-f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1491.742711] env[68492]: WARNING nova.compute.manager [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Received unexpected event network-vif-plugged-f1421022-59f5-463f-8c9e-793846976966 for instance with vm_state building and task_state spawning. [ 1491.742873] env[68492]: DEBUG nova.compute.manager [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Received event network-changed-f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1491.743054] env[68492]: DEBUG nova.compute.manager [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Refreshing instance network info cache due to event network-changed-f1421022-59f5-463f-8c9e-793846976966.
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1491.743242] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Acquiring lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1492.006334] env[68492]: DEBUG nova.network.neutron [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updating instance_info_cache with network_info: [{"id": "f1421022-59f5-463f-8c9e-793846976966", "address": "fa:16:3e:5a:fd:e3", "network": {"id": "3e8b00b2-eb85-4391-b1a8-6492e68ae004", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2097811126-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de1c7f873d504a5394cf856387e69e3d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb18870e-f482-4c7b-8cd4-5c933d3ad294", "external-id": "nsx-vlan-transportzone-76", "segmentation_id": 76, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1421022-59", "ovs_interfaceid": "f1421022-59f5-463f-8c9e-793846976966", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1492.022904] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Releasing lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1492.023424] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance network_info: |[{"id": "f1421022-59f5-463f-8c9e-793846976966", "address": "fa:16:3e:5a:fd:e3", "network": {"id": "3e8b00b2-eb85-4391-b1a8-6492e68ae004", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2097811126-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de1c7f873d504a5394cf856387e69e3d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb18870e-f482-4c7b-8cd4-5c933d3ad294", "external-id": "nsx-vlan-transportzone-76", "segmentation_id": 76, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tapf1421022-59", "ovs_interfaceid": "f1421022-59f5-463f-8c9e-793846976966", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1492.023735] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Acquired lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1492.023911] env[68492]: DEBUG nova.network.neutron [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Refreshing network info cache for port f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1492.025762] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5a:fd:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eb18870e-f482-4c7b-8cd4-5c933d3ad294', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f1421022-59f5-463f-8c9e-793846976966', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1492.033692] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Creating folder: Project (de1c7f873d504a5394cf856387e69e3d). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1492.034904] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e6edb034-1def-440a-b74a-c7eaa03972d7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.050654] env[68492]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1492.050812] env[68492]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=68492) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1492.051140] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Folder already exists: Project (de1c7f873d504a5394cf856387e69e3d). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1492.051442] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Creating folder: Instances. Parent ref: group-v677516. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1492.051682] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9112ea60-d23c-4401-a04a-5899ddf5c4e0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.062559] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Created folder: Instances in parent group-v677516. [ 1492.062832] env[68492]: DEBUG oslo.service.loopingcall [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1492.063033] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1492.063240] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b2f8c632-53fd-48e0-b7bf-fb53e026fa58 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.086108] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1492.086108] env[68492]: value = "task-3395501" [ 1492.086108] env[68492]: _type = "Task" [ 1492.086108] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1492.093426] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395501, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1492.575234] env[68492]: DEBUG nova.network.neutron [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updated VIF entry in instance network info cache for port f1421022-59f5-463f-8c9e-793846976966. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1492.575604] env[68492]: DEBUG nova.network.neutron [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updating instance_info_cache with network_info: [{"id": "f1421022-59f5-463f-8c9e-793846976966", "address": "fa:16:3e:5a:fd:e3", "network": {"id": "3e8b00b2-eb85-4391-b1a8-6492e68ae004", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2097811126-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de1c7f873d504a5394cf856387e69e3d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb18870e-f482-4c7b-8cd4-5c933d3ad294", "external-id": "nsx-vlan-transportzone-76", "segmentation_id": 76, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1421022-59", "ovs_interfaceid": "f1421022-59f5-463f-8c9e-793846976966", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1492.586232] env[68492]: DEBUG oslo_concurrency.lockutils [req-dfcc9627-30d2-4462-ae29-b93abd1857ac req-6b2081f2-e77f-44c3-ab25-ba8c5ca8bd97 service nova] Releasing lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1492.596771] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395501, 'name': CreateVM_Task, 'duration_secs': 0.30079} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1492.596984] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1492.597764] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'guest_format': None, 'attachment_id': 'f90144a9-aa5a-4859-a7f5-9061e3b1aaab', 'disk_bus': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677519', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'name': 'volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '40087617-1982-4727-ac78-1cb6437b11c9', 'attached_at': '', 'detached_at': '', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'serial': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43'}, 'device_type': None, 'boot_index': 0, 'delete_on_termination': True, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=68492) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1492.598027] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Root volume attach. Driver type: vmdk {{(pid=68492) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1492.599185] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a63992d-fc9e-49d6-9aad-6238ac214de1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.607644] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66615d62-a103-43e6-888f-7195c2c2d7f3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.613963] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f807881f-780c-4ea8-8311-0a624b3e8085 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.619836] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-2df0ba7f-e3f1-4c05-a78e-9a1b9cb5f435 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1492.626720] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1492.626720] env[68492]: value = "task-3395502" [ 1492.626720] env[68492]: _type = "Task" [ 1492.626720] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1492.634636] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1493.139959] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 40%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1493.639793] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 53%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.140443] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 67%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.641744] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 82%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1495.140545] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 97%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1495.641971] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task} progress is 98%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1496.142226] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395502, 'name': RelocateVM_Task, 'duration_secs': 3.107596} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1496.142620] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Volume attach. 
Driver type: vmdk {{(pid=68492) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1496.142676] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677519', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'name': 'volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '40087617-1982-4727-ac78-1cb6437b11c9', 'attached_at': '', 'detached_at': '', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'serial': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43'} {{(pid=68492) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1496.143431] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-501593bf-2365-4b8d-858c-e689f84345c0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.158364] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf812cc-48d3-4c48-bb59-3d6484586b3a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.180372] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Reconfiguring VM instance instance-00000050 to attach disk [datastore2] volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43/volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43.vmdk or device None with type thin {{(pid=68492) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1496.180632] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-81e3f64b-926d-49d7-b692-2af55c7d6d9b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.199317] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1496.199317] env[68492]: value = "task-3395503" [ 1496.199317] env[68492]: _type = "Task" [ 1496.199317] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1496.206527] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395503, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1496.712517] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395503, 'name': ReconfigVM_Task, 'duration_secs': 0.314828} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1496.712873] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Reconfigured VM instance instance-00000050 to attach disk [datastore2] volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43/volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43.vmdk or device None with type thin {{(pid=68492) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1496.717778] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c82ccf5b-056d-4d5e-b97f-f7d9cacd9b00 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.732449] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1496.732449] env[68492]: value = "task-3395504" [ 1496.732449] env[68492]: _type = "Task" [ 1496.732449] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1496.740339] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395504, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1497.243164] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395504, 'name': ReconfigVM_Task, 'duration_secs': 0.119459} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1497.243491] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677519', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'name': 'volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '40087617-1982-4727-ac78-1cb6437b11c9', 'attached_at': '', 'detached_at': '', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'serial': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43'} {{(pid=68492) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1497.244020] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-062aa5af-e469-4726-b0b6-3ca23e441ce6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.254355] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1497.254355] env[68492]: value = "task-3395505" [ 1497.254355] env[68492]: _type = "Task" [ 1497.254355] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1497.262357] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395505, 'name': Rename_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1497.766802] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395505, 'name': Rename_Task, 'duration_secs': 0.123445} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1497.767629] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Powering on the VM {{(pid=68492) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1497.767629] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-ab2e435b-f239-488e-ad0f-8c20ea581105 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.774814] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1497.774814] env[68492]: value = "task-3395506" [ 1497.774814] env[68492]: _type = "Task" [ 1497.774814] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1497.782781] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395506, 'name': PowerOnVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.285027] env[68492]: DEBUG oslo_vmware.api [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395506, 'name': PowerOnVM_Task, 'duration_secs': 0.415516} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1498.285348] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Powered on the VM {{(pid=68492) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1498.285533] env[68492]: INFO nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Took 7.71 seconds to spawn the instance on the hypervisor. [ 1498.285713] env[68492]: DEBUG nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Checking state {{(pid=68492) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 1498.286476] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be687714-db1f-459d-81f1-66b09413f16d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.338089] env[68492]: INFO nova.compute.manager [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Took 8.51 seconds to build instance. [ 1498.355410] env[68492]: DEBUG oslo_concurrency.lockutils [None req-60498ba1-3839-4ccc-800c-90f1f67590c5 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 151.865s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1498.364872] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Starting instance...
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1498.415364] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1498.415670] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1498.417202] env[68492]: INFO nova.compute.claims [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1498.695130] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f496815a-0a08-46e8-a142-2f72c4eb99b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.702340] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1be45dcb-b0e8-4917-9a5e-a14a83ec8cb7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.733014] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b81c52-6ccf-491a-b6e7-7016499185c0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.739956] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01201598-45ce-4a8f-a7ca-9c98485c748c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.752638] env[68492]: DEBUG nova.compute.provider_tree [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1498.762929] env[68492]: DEBUG nova.scheduler.client.report [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
1498.778350] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1498.778875] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1498.818307] env[68492]: DEBUG nova.compute.utils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1498.821377] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1498.821377] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1498.830309] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1498.899242] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1498.927752] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1498.927996] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1498.928224] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1498.928429] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1498.928577] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1498.928723] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1498.928926] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1498.929097] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492)
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1498.929267] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1498.929429] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1498.929598] env[68492]: DEBUG nova.virt.hardware [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1498.930479] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a90fb25-806c-41be-b0cf-4cabbb7327fc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.936768] env[68492]: DEBUG nova.policy [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b18e8b4357d42bf9355a8f987f9f304', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1058a8bbbed0405e9f89cbbc727969e8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1498.941554] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb26aa65-6237-4bd5-9f4b-66d5c5ad9643 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.408048] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Successfully created port: ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1500.239025] env[68492]: DEBUG nova.compute.manager [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Received event network-vif-plugged-ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1500.239025] env[68492]: DEBUG oslo_concurrency.lockutils [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] Acquiring lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.239025] env[68492]: DEBUG oslo_concurrency.lockutils [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.239025] env[68492]: DEBUG oslo_concurrency.lockutils [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.239380] env[68492]: DEBUG nova.compute.manager [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] No waiting events found dispatching network-vif-plugged-ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1500.239380] env[68492]: WARNING nova.compute.manager [req-97bf5ad8-2e45-4486-b50a-e9eb839cbabe req-2902886b-7780-4ca3-b73f-c87db6eed203 service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Received unexpected event network-vif-plugged-ddb5439a-55d5-40aa-937c-91e900ee13ec for instance with vm_state building and task_state spawning. [ 1500.255046] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Successfully updated port: ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1500.269564] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1500.269855] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquired lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1500.269936] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1500.307515] env[68492]: DEBUG nova.compute.manager [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Received event network-changed-f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11107}} [ 1500.307674] env[68492]: DEBUG nova.compute.manager [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Refreshing instance network info cache due to event network-changed-f1421022-59f5-463f-8c9e-793846976966. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1500.307886] env[68492]: DEBUG oslo_concurrency.lockutils [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] Acquiring lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1500.308035] env[68492]: DEBUG oslo_concurrency.lockutils [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] Acquired lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1500.308244] env[68492]: DEBUG nova.network.neutron [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Refreshing network info cache for port f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1500.341469] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1500.599468] env[68492]: DEBUG nova.network.neutron [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updated VIF entry in instance network info cache for port f1421022-59f5-463f-8c9e-793846976966. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1500.600071] env[68492]: DEBUG nova.network.neutron [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updating instance_info_cache with network_info: [{"id": "f1421022-59f5-463f-8c9e-793846976966", "address": "fa:16:3e:5a:fd:e3", "network": {"id": "3e8b00b2-eb85-4391-b1a8-6492e68ae004", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2097811126-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.145", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de1c7f873d504a5394cf856387e69e3d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eb18870e-f482-4c7b-8cd4-5c933d3ad294", "external-id": "nsx-vlan-transportzone-76", "segmentation_id": 76, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1421022-59", "ovs_interfaceid": "f1421022-59f5-463f-8c9e-793846976966", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1500.614743] env[68492]: DEBUG oslo_concurrency.lockutils [req-32eb31bd-40ba-4874-8136-c2886f2cf99d req-a398fd31-84d2-4698-9d0c-88d6e6c1e65d service nova] Releasing lock "refresh_cache-40087617-1982-4727-ac78-1cb6437b11c9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.885354] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Updating instance_info_cache with network_info: [{"id": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "address": "fa:16:3e:85:01:07", "network": {"id": "25d9f81c-66e0-4bfb-a5e2-b8729fe74375", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1378695200-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1058a8bbbed0405e9f89cbbc727969e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddb5439a-55", "ovs_interfaceid": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 1500.896278] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Releasing lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.896572] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance network_info: |[{"id": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "address": "fa:16:3e:85:01:07", "network": {"id": "25d9f81c-66e0-4bfb-a5e2-b8729fe74375", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1378695200-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1058a8bbbed0405e9f89cbbc727969e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddb5439a-55", "ovs_interfaceid": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1500.896972] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:85:01:07', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '53ef6889-a40c-40f5-a6e5-d8726606296a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ddb5439a-55d5-40aa-937c-91e900ee13ec', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1500.905359] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Creating folder: Project (1058a8bbbed0405e9f89cbbc727969e8). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1500.906648] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-40ea9861-f6be-4685-8c7f-88f8de1fd849 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.917829] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Created folder: Project (1058a8bbbed0405e9f89cbbc727969e8) in parent group-v677434. [ 1500.918310] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Creating folder: Instances. Parent ref: group-v677531. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1500.918310] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-654ca067-9c8c-49b9-86bb-6d8e9a9bbd6c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.927501] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Created folder: Instances in parent group-v677531. [ 1500.927771] env[68492]: DEBUG oslo.service.loopingcall [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1500.927926] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1500.928161] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8ffe1ee-69b0-4def-90d0-2938f8f57726 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.947648] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1500.947648] env[68492]: value = "task-3395509" [ 1500.947648] env[68492]: _type = "Task" [ 1500.947648] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1500.955359] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395509, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1501.457480] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395509, 'name': CreateVM_Task, 'duration_secs': 0.277431} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1501.457647] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1501.458323] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1501.458488] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1501.458811] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1501.459087] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e106989a-127e-4966-85db-3e8f09146226 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1501.463360] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for the task: (returnval){ [ 1501.463360] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52de677c-56c2-9e64-5393-58652b842b8c" [ 1501.463360] env[68492]: _type = "Task" [ 1501.463360] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1501.470545] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52de677c-56c2-9e64-5393-58652b842b8c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1501.973630] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1501.973943] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1501.974119] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1502.264413] env[68492]: DEBUG nova.compute.manager [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Received event network-changed-ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1502.264591] env[68492]: DEBUG nova.compute.manager [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Refreshing instance network info cache due to event network-changed-ddb5439a-55d5-40aa-937c-91e900ee13ec. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1502.264813] env[68492]: DEBUG oslo_concurrency.lockutils [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] Acquiring lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1502.264962] env[68492]: DEBUG oslo_concurrency.lockutils [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] Acquired lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1502.265387] env[68492]: DEBUG nova.network.neutron [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Refreshing network info cache for port ddb5439a-55d5-40aa-937c-91e900ee13ec {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1502.554771] env[68492]: DEBUG nova.network.neutron [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Updated VIF entry in instance network info cache for port ddb5439a-55d5-40aa-937c-91e900ee13ec. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1502.555157] env[68492]: DEBUG nova.network.neutron [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Updating instance_info_cache with network_info: [{"id": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "address": "fa:16:3e:85:01:07", "network": {"id": "25d9f81c-66e0-4bfb-a5e2-b8729fe74375", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1378695200-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1058a8bbbed0405e9f89cbbc727969e8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ef6889-a40c-40f5-a6e5-d8726606296a", "external-id": "nsx-vlan-transportzone-537", "segmentation_id": 537, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddb5439a-55", "ovs_interfaceid": "ddb5439a-55d5-40aa-937c-91e900ee13ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1502.566085] env[68492]: DEBUG oslo_concurrency.lockutils [req-726d0181-56ed-49b1-9077-c6d38cedef11 req-c8cb0ccc-c75a-4b4c-b710-ea459ab1f0ce service nova] Releasing lock "refresh_cache-b0757e62-96ca-4758-8444-dcc98fbf0a29" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1516.645567] env[68492]: INFO nova.compute.manager [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Rebuilding instance [ 1516.690908] env[68492]: DEBUG nova.compute.manager [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Checking state {{(pid=68492) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 1516.690908] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5b7a3c3-baaa-4aef-a2ed-4ed55638ac8f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.728537] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Powering off the VM {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1516.729058] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-6ccb0c1b-78f2-4ce6-9ae0-5c4cb604aaf2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.735946] env[68492]: DEBUG oslo_vmware.api [None 
req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1516.735946] env[68492]: value = "task-3395510" [ 1516.735946] env[68492]: _type = "Task" [ 1516.735946] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1516.744421] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395510, 'name': PowerOffVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1517.245557] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395510, 'name': PowerOffVM_Task, 'duration_secs': 0.162963} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1517.246213] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Powered off the VM {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1517.246949] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Powering off the VM {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1517.247255] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-0d2ea73c-1063-4d23-84f4-ebce6211c83d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.252760] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1517.252760] env[68492]: value = "task-3395511" [ 1517.252760] env[68492]: _type = "Task" [ 1517.252760] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1517.260295] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395511, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1517.763135] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] VM already powered off {{(pid=68492) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 1517.763447] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Volume detach. Driver type: vmdk {{(pid=68492) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1517.763538] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677519', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'name': 'volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '40087617-1982-4727-ac78-1cb6437b11c9', 'attached_at': '', 'detached_at': '', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'serial': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43'} {{(pid=68492) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1517.764334] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-486311e8-b5db-4b42-a26a-95387b92fa21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.781518] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02f42443-a25d-40ff-b0fc-4223b089e5a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.788052] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05e0969c-a5ba-4ae9-b275-ca3eeb299114 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.804848] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2255496-e5af-4f90-a042-36ca41b87342 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.820645] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] The volume has not been displaced from its original location: [datastore2] volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43/volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43.vmdk. No consolidation needed. 
{{(pid=68492) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1517.825861] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Reconfiguring VM instance instance-00000050 to detach disk 2000 {{(pid=68492) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1517.826172] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-276d6cb9-58e4-4df7-8b9b-8ca140db49e5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.843593] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1517.843593] env[68492]: value = "task-3395512" [ 1517.843593] env[68492]: _type = "Task" [ 1517.843593] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1517.851490] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395512, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1518.354171] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395512, 'name': ReconfigVM_Task, 'duration_secs': 0.172498} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1518.354448] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Reconfigured VM instance instance-00000050 to detach disk 2000 {{(pid=68492) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1518.359053] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-dfcdf0d4-2fb0-46d8-bce4-730ea1c5827f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.373312] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1518.373312] env[68492]: value = "task-3395513" [ 1518.373312] env[68492]: _type = "Task" [ 1518.373312] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1518.381058] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395513, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1518.884878] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395513, 'name': ReconfigVM_Task, 'duration_secs': 0.097725} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1518.885361] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-677519', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'name': 'volume-76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '40087617-1982-4727-ac78-1cb6437b11c9', 'attached_at': '', 'detached_at': '', 'volume_id': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43', 'serial': '76e52b39-acb3-4e5d-bd8a-19483b9f2e43'} {{(pid=68492) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1518.885756] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1518.886937] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93e11d5-4015-4c0c-a3ce-7d443b241dd8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.894911] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1518.895158] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d248fd2e-a614-49eb-ab07-a2380e6bce8f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.954380] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1518.954380] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1518.954598] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 
tempest-ServerActionsV293TestJSON-1981608528-project-member] Deleting the datastore file [datastore2] 40087617-1982-4727-ac78-1cb6437b11c9 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1518.954718] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7f48d9f6-c14b-4643-9712-417ed1cf37f6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1518.962910] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for the task: (returnval){ [ 1518.962910] env[68492]: value = "task-3395515" [ 1518.962910] env[68492]: _type = "Task" [ 1518.962910] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1518.971313] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395515, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1519.473591] env[68492]: DEBUG oslo_vmware.api [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Task: {'id': task-3395515, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083281} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1519.473842] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1519.474033] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1519.474214] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1519.526378] env[68492]: DEBUG nova.virt.vmwareapi.volumeops [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Volume detach. 
Driver type: vmdk {{(pid=68492) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1519.526703] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ac256f81-bb58-4c2c-ad3f-cc1ff0c7393b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.534779] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c029f42-457c-468f-aa7d-e7ae696957c8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.567454] env[68492]: ERROR nova.compute.manager [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Failed to detach volume 76e52b39-acb3-4e5d-bd8a-19483b9f2e43 from /dev/sda: nova.exception.InstanceNotFound: Instance 40087617-1982-4727-ac78-1cb6437b11c9 could not be found. [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Traceback (most recent call last): [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 4117, in _do_rebuild_instance [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self.driver.rebuild(**kwargs) [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise NotImplementedError() [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] NotImplementedError [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] During handling of the above exception, another exception occurred: [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Traceback (most recent call last): [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3540, in _detach_root_volume [ 1519.567454] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self.driver.detach_volume(context, old_connection_info, [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] return self._volumeops.detach_volume(connection_info, instance) [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._detach_volume_vmdk(connection_info, instance) [ 1519.567959] env[68492]: ERROR nova.compute.manager 
[instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] stable_ref.fetch_moref(session) [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] nova.exception.InstanceNotFound: Instance 40087617-1982-4727-ac78-1cb6437b11c9 could not be found. [ 1519.567959] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.685838] env[68492]: DEBUG nova.compute.utils [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Build of instance 40087617-1982-4727-ac78-1cb6437b11c9 aborted: Failed to rebuild volume backed instance. {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1519.688150] env[68492]: ERROR nova.compute.manager [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 40087617-1982-4727-ac78-1cb6437b11c9 aborted: Failed to rebuild volume backed instance. 
[ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Traceback (most recent call last): [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 4117, in _do_rebuild_instance [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self.driver.rebuild(**kwargs) [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise NotImplementedError() [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] NotImplementedError [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] During handling of the above exception, another exception occurred: [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Traceback (most recent call last): [ 1519.688150] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3575, in _rebuild_volume_backed_instance [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._detach_root_volume(context, instance, root_bdm) [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3554, in _detach_root_volume [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] with excutils.save_and_reraise_exception(): [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self.force_reraise() [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise self.value [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3540, in _detach_root_volume [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self.driver.detach_volume(context, old_connection_info, [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] return self._volumeops.detach_volume(connection_info, instance) [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File 
"/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1519.688520] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._detach_volume_vmdk(connection_info, instance) [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] stable_ref.fetch_moref(session) [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] nova.exception.InstanceNotFound: Instance 40087617-1982-4727-ac78-1cb6437b11c9 could not be found. [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] During handling of the above exception, another exception occurred: [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Traceback (most recent call last): [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 10841, in _error_out_instance_on_exception [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] yield [ 1519.688914] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3843, in rebuild_instance [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._do_rebuild_instance_with_claim( [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3929, in _do_rebuild_instance_with_claim [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._do_rebuild_instance( [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 4121, in _do_rebuild_instance [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] self._rebuild_default_impl(**kwargs) [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3698, in _rebuild_default_impl [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] 
self._rebuild_volume_backed_instance( [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] File "/opt/stack/nova/nova/compute/manager.py", line 3590, in _rebuild_volume_backed_instance [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] raise exception.BuildAbortException( [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] nova.exception.BuildAbortException: Build of instance 40087617-1982-4727-ac78-1cb6437b11c9 aborted: Failed to rebuild volume backed instance. [ 1519.689545] env[68492]: ERROR nova.compute.manager [instance: 40087617-1982-4727-ac78-1cb6437b11c9] [ 1519.773651] env[68492]: DEBUG oslo_concurrency.lockutils [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1519.773902] env[68492]: DEBUG oslo_concurrency.lockutils [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1519.939416] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a5e274-d7ec-4e70-ac7a-ea52abcf4b0b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.948665] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c77264-b1bc-4e11-89a8-8f43908fadc1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.979541] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1e0b23f-04d5-4e9c-b895-5382112920df {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1519.987467] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-743ff564-4166-43ff-8137-42f6073bb753 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.001594] env[68492]: DEBUG nova.compute.provider_tree [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1520.009011] env[68492]: DEBUG nova.scheduler.client.report [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1520.022852] env[68492]: DEBUG oslo_concurrency.lockutils [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.249s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1520.023059] env[68492]: INFO nova.compute.manager [None req-59a152a1-b30a-4ab5-a8d7-bd8469577f17 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Successfully reverted task state from rebuilding on failure for instance. [ 1520.481383] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "40087617-1982-4727-ac78-1cb6437b11c9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1520.481766] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1520.481834] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "40087617-1982-4727-ac78-1cb6437b11c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1520.482024] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1520.482260] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1520.484742] env[68492]: INFO nova.compute.manager [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 
tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Terminating instance [ 1520.486615] env[68492]: DEBUG nova.compute.manager [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1520.486903] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-946027e2-2f1a-4c0a-aa93-1f166a53e72c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.496531] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564e48c3-20cf-41fa-a346-63aa7819f6c2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.526881] env[68492]: WARNING nova.virt.vmwareapi.driver [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance does not exist. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 40087617-1982-4727-ac78-1cb6437b11c9 could not be found. [ 1520.527118] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1520.527473] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2bbed9c4-aba9-46f2-9350-6e28f2ade262 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.535238] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ffd52ff-b0d6-435e-9fc7-16f1dc78fac5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1520.563063] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 40087617-1982-4727-ac78-1cb6437b11c9 could not be found. [ 1520.563274] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1520.563458] env[68492]: INFO nova.compute.manager [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Took 0.08 seconds to destroy the instance on the hypervisor.
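
The WARNING records above show the vmwareapi driver treating a VM that is already gone from the backend as successfully destroyed: the moref lookup raises InstanceNotFound, the driver logs a warning and carries on with the teardown instead of failing the delete. A minimal sketch of that pattern, assuming hypothetical names (InstanceNotFound, lookup_vm, destroy are stand-ins, not Nova's actual signatures):

    import logging

    logging.basicConfig(level=logging.WARNING)
    LOG = logging.getLogger("destroy-sketch")

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def lookup_vm(session, instance_uuid):
        # Hypothetical moref lookup; here it always fails, as in the log above.
        raise InstanceNotFound(instance_uuid)

    def destroy(session, instance_uuid):
        """Best-effort destroy: a VM missing on the backend counts as destroyed."""
        try:
            vm_ref = lookup_vm(session, instance_uuid)
        except InstanceNotFound:
            LOG.warning("Instance does not exist on backend: %s", instance_uuid)
            return  # nothing to unregister; the delete still succeeds
        # ...unregister the VM and delete its datastore files here...

    destroy(None, "40087617-1982-4727-ac78-1cb6437b11c9")

Making delete idempotent this way is what lets the terminate request above finish cleanly even though the earlier rebuild had already lost the backend VM.
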
[ 1520.563697] env[68492]: DEBUG oslo.service.loopingcall [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1520.563985] env[68492]: DEBUG nova.compute.manager [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1520.564100] env[68492]: DEBUG nova.network.neutron [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1521.138114] env[68492]: DEBUG nova.network.neutron [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1521.150408] env[68492]: INFO nova.compute.manager [-] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Took 0.58 seconds to deallocate network for instance. [ 1521.172132] env[68492]: DEBUG nova.compute.manager [req-051c0a65-db3d-475b-99d6-364f56b5fea1 req-d0d335aa-7b0f-4887-8b53-1c55179c8ef2 service nova] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Received event network-vif-deleted-f1421022-59f5-463f-8c9e-793846976966 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1521.225329] env[68492]: INFO nova.compute.manager [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Took 0.07 seconds to detach 1 volume for instance.
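
The _deallocate_network_with_retries record above shows that network teardown is wrapped in an oslo.service looping call so a transient Neutron failure does not abort the delete. A generic retry-with-backoff sketch of the same idea in plain Python (not oslo.service's actual API; deallocate_network below is a stand-in):

    import time

    def call_with_retries(func, attempts=3, initial_delay=1.0, backoff=2.0):
        """Call func until it succeeds or the attempts are exhausted."""
        delay = initial_delay
        for attempt in range(1, attempts + 1):
            try:
                return func()
            except Exception:
                if attempt == attempts:
                    raise  # out of retries; surface the failure to the caller
                time.sleep(delay)
                delay *= backoff  # back off between tries

    # Usage (deallocate_network is a hypothetical stand-in for the real call):
    # call_with_retries(lambda: deallocate_network(context, instance), attempts=3)
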
[ 1521.228023] env[68492]: DEBUG nova.compute.manager [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Deleting volume: 76e52b39-acb3-4e5d-bd8a-19483b9f2e43 {{(pid=68492) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3222}} [ 1521.302145] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1521.302448] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1521.302693] env[68492]: DEBUG nova.objects.instance [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lazy-loading 'resources' on Instance uuid 40087617-1982-4727-ac78-1cb6437b11c9 {{(pid=68492) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1521.574709] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7476d035-56d9-4172-bf16-70dcd12e36cc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1521.582441] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-badf73e5-12c7-427b-b646-bf635f1b2c9b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1521.613777] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d369b66-2f93-4448-aa92-29300bb1a482 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1521.622103] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1707f65-7613-4b5f-9ad2-cf2f4818377d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1521.635340] env[68492]: DEBUG nova.compute.provider_tree [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1521.644652] env[68492]: DEBUG nova.scheduler.client.report [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 
512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1521.658687] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.356s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1521.715548] env[68492]: DEBUG oslo_concurrency.lockutils [None req-981eb39d-1f5b-484f-aff1-86c7755d3e42 tempest-ServerActionsV293TestJSON-1981608528 tempest-ServerActionsV293TestJSON-1981608528-project-member] Lock "40087617-1982-4727-ac78-1cb6437b11c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.234s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.664042] env[68492]: WARNING oslo_vmware.rw_handles [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1535.664042] env[68492]: ERROR oslo_vmware.rw_handles [ 1535.664042] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1535.666208] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Caching 
image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1535.666458] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Copying Virtual Disk [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/7ea08c6a-ed41-4f0f-9932-16b725523696/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1535.666767] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f799a42d-ed1d-4e41-80dd-e1eb9dc1a4b9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.674557] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for the task: (returnval){ [ 1535.674557] env[68492]: value = "task-3395517" [ 1535.674557] env[68492]: _type = "Task" [ 1535.674557] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1535.682499] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Task: {'id': task-3395517, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1536.185619] env[68492]: DEBUG oslo_vmware.exceptions [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1536.185919] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1536.186506] env[68492]: ERROR nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1536.186506] env[68492]: Faults: ['InvalidArgument'] [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Traceback (most recent call last): [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] yield resources [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self.driver.spawn(context, instance, image_meta, [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self._fetch_image_if_missing(context, vi) [ 1536.186506] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] image_cache(vi, tmp_image_ds_loc) [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] vm_util.copy_virtual_disk( [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] session._wait_for_task(vmdk_copy_task) [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return self.wait_for_task(task_ref) [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return evt.wait() [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] result = hub.switch() [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1536.186922] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return self.greenlet.switch() [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self.f(*self.args, **self.kw) [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] raise exceptions.translate_fault(task_info.error) [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Faults: ['InvalidArgument'] [ 1536.187282] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] [ 1536.187282] env[68492]: INFO nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Terminating instance [ 1536.188358] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1536.188577] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1536.188823] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-97789047-5a6d-48f6-906e-ebeb5a910748 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.191055] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1536.191251] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1536.191968] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f0b04d-7b7f-4cb1-bd8d-552c7d4b7b4c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.199046] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1536.200099] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3a5a0251-73d6-4685-b34b-e11dce4d1e3c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.201517] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1536.201690] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1536.202415] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2900c667-469e-4798-b407-614b98fb15c6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.207332] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for the task: (returnval){ [ 1536.207332] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52b4de70-84a0-9dfc-6726-40257e75fc7f" [ 1536.207332] env[68492]: _type = "Task" [ 1536.207332] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1536.215619] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52b4de70-84a0-9dfc-6726-40257e75fc7f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1536.264803] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1536.265048] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1536.265258] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Deleting the datastore file [datastore2] cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1536.265574] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-87a3beb1-446e-4cbc-9555-bd24fcf8c41d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.272807] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for the task: (returnval){ [ 1536.272807] env[68492]: value = "task-3395519" [ 1536.272807] env[68492]: _type = "Task" [ 1536.272807] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1536.280676] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Task: {'id': task-3395519, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1536.717593] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1536.717894] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Creating directory with path [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1536.718103] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e9606e51-2db2-40ce-9837-83518f1a10e4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.729536] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Created directory with path [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1536.729764] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Fetch image to [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1536.729973] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1536.730718] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660d4e77-e5db-4d06-8330-03c6971a8c7a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.736857] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42e2656d-766e-45f7-8180-2d44cb34264e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.745618] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d2f5168-9c06-464f-8cac-15ff343c49f5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.777673] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16425448-b8b8-44bf-bad4-f191116d99c2 {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.784140] env[68492]: DEBUG oslo_vmware.api [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Task: {'id': task-3395519, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061496} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1536.785600] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1536.785789] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1536.785960] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1536.786148] env[68492]: INFO nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Took 0.59 seconds to destroy the instance on the hypervisor. 
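
The Task records above ("Waiting for the task ... to complete", "progress is 0%", then "completed successfully" with a duration_secs) come from a poll loop: the client repeatedly reads the task state and either returns the result, sleeps, or, as in the earlier CopyVirtualDisk_Task traceback, translates the reported fault into an exception. A simplified sketch of such a loop (get_task_info and the state dict are assumptions, not oslo.vmware's real interface):

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(get_task_info, task_id, poll_interval=0.5, timeout=300.0):
        """Poll a task until it finishes; raise on error or timeout."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info(task_id)  # e.g. {"state": "running", "progress": 0}
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise TaskFailed(info.get("error", f"task {task_id} failed"))
            time.sleep(poll_interval)  # queued or running; poll again
        raise TaskFailed(f"timed out waiting for task {task_id}")

    # Usage with a trivial stub that completes immediately:
    print(wait_for_task(lambda t: {"state": "success", "result": "ok"}, "task-3395519"))
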
[ 1536.787874] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5adfe6b1-26c7-47a0-a955-8c6ed9614fb6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.789665] env[68492]: DEBUG nova.compute.claims [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1536.789834] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1536.790051] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1536.813364] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1536.864565] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1536.924755] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1536.924924] env[68492]: DEBUG oslo_vmware.rw_handles [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1537.063044] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9224e588-e36b-4ef3-aac1-834fec1994ec {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.070646] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69a56d1f-baf2-4910-b811-8a30edc8f049 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.100879] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-232d5291-7b29-4f1f-82a9-1ab53fd4182c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.107789] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63aac596-050e-47ec-8500-b127e950c257 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.121159] env[68492]: DEBUG nova.compute.provider_tree [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1537.129849] env[68492]: DEBUG nova.scheduler.client.report [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1537.143229] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.353s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.143736] env[68492]: ERROR nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1537.143736] env[68492]: Faults: ['InvalidArgument'] [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Traceback (most recent call last): [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/compute/manager.py", line 2616, 
in _build_and_run_instance [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self.driver.spawn(context, instance, image_meta, [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self._fetch_image_if_missing(context, vi) [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] image_cache(vi, tmp_image_ds_loc) [ 1537.143736] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] vm_util.copy_virtual_disk( [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] session._wait_for_task(vmdk_copy_task) [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return self.wait_for_task(task_ref) [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return evt.wait() [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] result = hub.switch() [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] return self.greenlet.switch() [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1537.144116] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] self.f(*self.args, **self.kw) [ 1537.144433] env[68492]: ERROR nova.compute.manager [instance: 
cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1537.144433] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] raise exceptions.translate_fault(task_info.error) [ 1537.144433] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1537.144433] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Faults: ['InvalidArgument'] [ 1537.144433] env[68492]: ERROR nova.compute.manager [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] [ 1537.144433] env[68492]: DEBUG nova.compute.utils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1537.146058] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Build of instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 was re-scheduled: A specified parameter was not correct: fileType [ 1537.146058] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1537.146442] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1537.146610] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1537.146774] env[68492]: DEBUG nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1537.146935] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1537.533992] env[68492]: DEBUG nova.network.neutron [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1537.545434] env[68492]: INFO nova.compute.manager [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Took 0.40 seconds to deallocate network for instance. [ 1537.648998] env[68492]: INFO nova.scheduler.client.report [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Deleted allocations for instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 [ 1537.679809] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bbab8f8d-d039-464d-998e-1d1bd8af97eb tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 580.655s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.681297] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 384.847s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1537.683277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Acquiring lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1537.683277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock 
"cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1537.683277] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.684996] env[68492]: INFO nova.compute.manager [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Terminating instance [ 1537.687266] env[68492]: DEBUG nova.compute.manager [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1537.687643] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1537.688013] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d45fa108-1e80-4e04-88d9-6d4c13b834e5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.697166] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e0cabc-8ce9-4c52-8744-fa7fe5ed0d4b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.708484] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1537.729852] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cbadf6d3-a000-4e96-bea4-96d1c80ea3c7 could not be found. 
[ 1537.730109] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1537.730257] env[68492]: INFO nova.compute.manager [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1537.731062] env[68492]: DEBUG oslo.service.loopingcall [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1537.731062] env[68492]: DEBUG nova.compute.manager [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1537.731062] env[68492]: DEBUG nova.network.neutron [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1537.757778] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1537.758031] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1537.759436] env[68492]: INFO nova.compute.claims [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1537.761854] env[68492]: DEBUG nova.network.neutron [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1537.770913] env[68492]: INFO nova.compute.manager [-] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] Took 0.04 seconds to deallocate network for instance. 
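
The inventory dicts repeated in this log feed Placement's capacity check, which by default treats usable capacity per resource class as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. Applying that formula to the reported inventory:

    def capacity(total, reserved, allocation_ratio):
        """Usable capacity per Placement's check: (total - reserved) * ratio."""
        return (total - reserved) * allocation_ratio

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        print(rc, capacity(inv["total"], inv["reserved"], inv["allocation_ratio"]))
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

This is why the claim above succeeds on a 48-core node despite earlier instances: the 4.0 VCPU allocation ratio exposes 192 schedulable vCPUs.
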
[ 1537.866809] env[68492]: DEBUG oslo_concurrency.lockutils [None req-fefba7f3-cfb5-4d9e-a86f-b8834fc38b5a tempest-AttachInterfacesUnderV243Test-1124718926 tempest-AttachInterfacesUnderV243Test-1124718926-project-member] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.867715] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 377.622s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1537.867903] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: cbadf6d3-a000-4e96-bea4-96d1c80ea3c7] During sync_power_state the instance has a pending task (deleting). Skip. [ 1537.868085] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "cbadf6d3-a000-4e96-bea4-96d1c80ea3c7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.983345] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-807e15fd-48bd-4b40-b293-44dc0360b6a4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.990955] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c8e135-3ef3-41aa-8a5f-bb2685e60fe6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1538.019917] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d58f484-5e56-4e15-ac26-2e73458a49ab {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1538.026353] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29b107a8-f6f4-4ee2-b202-56afbc392f5f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1538.039973] env[68492]: DEBUG nova.compute.provider_tree [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1538.050164] env[68492]: DEBUG nova.scheduler.client.report [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1538.064208] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1538.064842] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1538.094417] env[68492]: DEBUG nova.compute.utils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1538.095829] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1538.096012] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1538.103795] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1538.158447] env[68492]: DEBUG nova.policy [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be6f14b55e8c4c7685f66b73c5691050', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b091548dc7c416c9d07bb6c8cd1edf0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1538.163035] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1538.187617] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1538.187852] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1538.188018] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1538.188198] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1538.188367] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1538.188518] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1538.188717] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1538.188872] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1538.189051] 
env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1538.189216] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1538.189385] env[68492]: DEBUG nova.virt.hardware [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1538.190238] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1deb9f6f-8e55-425c-9132-74c27c666a40 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1538.197971] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73bd3a67-99d9-4906-90b4-a8e9ff23bbc5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1538.519758] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Successfully created port: f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1539.225165] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Successfully updated port: f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1539.235172] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1539.235323] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquired lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1539.235471] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1539.276693] env[68492]: DEBUG 
nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1539.443036] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Updating instance_info_cache with network_info: [{"id": "f6ac14ff-aca4-4e04-a679-594140792ecc", "address": "fa:16:3e:45:cc:69", "network": {"id": "3a2796a2-829f-4991-b98e-ca4d599e9eda", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-643093598-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b091548dc7c416c9d07bb6c8cd1edf0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52c1f5eb-3d4a-4faa-a30d-2b0a46430791", "external-id": "nsx-vlan-transportzone-775", "segmentation_id": 775, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6ac14ff-ac", "ovs_interfaceid": "f6ac14ff-aca4-4e04-a679-594140792ecc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1539.453420] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Releasing lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1539.453704] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance network_info: |[{"id": "f6ac14ff-aca4-4e04-a679-594140792ecc", "address": "fa:16:3e:45:cc:69", "network": {"id": "3a2796a2-829f-4991-b98e-ca4d599e9eda", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-643093598-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b091548dc7c416c9d07bb6c8cd1edf0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52c1f5eb-3d4a-4faa-a30d-2b0a46430791", "external-id": "nsx-vlan-transportzone-775", "segmentation_id": 775, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6ac14ff-ac", "ovs_interfaceid": 
"f6ac14ff-aca4-4e04-a679-594140792ecc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1539.454096] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:45:cc:69', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '52c1f5eb-3d4a-4faa-a30d-2b0a46430791', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f6ac14ff-aca4-4e04-a679-594140792ecc', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1539.462177] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Creating folder: Project (5b091548dc7c416c9d07bb6c8cd1edf0). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1539.462738] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-523b9ed5-135a-4467-88b0-528db95d1cbe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1539.474265] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Created folder: Project (5b091548dc7c416c9d07bb6c8cd1edf0) in parent group-v677434. [ 1539.474447] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Creating folder: Instances. Parent ref: group-v677534. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1539.474680] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e4044ed-f76d-421d-814c-001ddbdd4571 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1539.483183] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Created folder: Instances in parent group-v677534. [ 1539.483421] env[68492]: DEBUG oslo.service.loopingcall [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1539.483620] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1539.483814] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b19fa68-2b0a-44f7-ac87-1736f5cc79f8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1539.502091] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1539.502091] env[68492]: value = "task-3395522" [ 1539.502091] env[68492]: _type = "Task" [ 1539.502091] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1539.509245] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395522, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1539.575475] env[68492]: DEBUG nova.compute.manager [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Received event network-vif-plugged-f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1539.575673] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Acquiring lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1539.575880] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1539.576059] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1539.576311] env[68492]: DEBUG nova.compute.manager [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] No waiting events found dispatching network-vif-plugged-f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1539.576422] env[68492]: WARNING nova.compute.manager [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Received unexpected event network-vif-plugged-f6ac14ff-aca4-4e04-a679-594140792ecc for instance with vm_state building and task_state spawning. 
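The CreateVM_Task block above shows oslo.vmware's task protocol: Folder.CreateVM_Task is invoked over SOAP, the returned Task moref is printed, and wait_for_task polls it ("progress is 0%") until it reaches a terminal state. A rough sketch of the calling side, assuming placeholder credentials and managed-object references (in nova these come from vm_util and the driver's session):

    # Sketch of the oslo.vmware polling pattern seen in the log. Host,
    # credentials, and the folder/config/pool values are placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    folder_ref = config_spec = respool_ref = None  # placeholder morefs/spec

    # invoke_api() issues the SOAP request (Folder.CreateVM_Task here) and
    # returns a Task reference; wait_for_task() polls it, logging progress,
    # and raises if the task finishes in an error state.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=respool_ref)
    task_info = session.wait_for_task(task)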
[ 1539.576536] env[68492]: DEBUG nova.compute.manager [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Received event network-changed-f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1539.576729] env[68492]: DEBUG nova.compute.manager [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Refreshing instance network info cache due to event network-changed-f6ac14ff-aca4-4e04-a679-594140792ecc. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1539.576862] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Acquiring lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1539.576994] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Acquired lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1539.577157] env[68492]: DEBUG nova.network.neutron [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Refreshing network info cache for port f6ac14ff-aca4-4e04-a679-594140792ecc {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1540.012211] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395522, 'name': CreateVM_Task, 'duration_secs': 0.285213} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1540.012386] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1540.013445] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1540.013445] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1540.013670] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1540.013935] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a71f094e-3b28-481b-a841-c87e8f2fe031 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1540.018216] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for the task: (returnval){ [ 1540.018216] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e63f06-f0e4-3cfb-7542-ae2f6ff2041e" [ 1540.018216] env[68492]: _type = "Task" [ 1540.018216] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1540.025790] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e63f06-f0e4-3cfb-7542-ae2f6ff2041e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1540.217421] env[68492]: DEBUG nova.network.neutron [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Updated VIF entry in instance network info cache for port f6ac14ff-aca4-4e04-a679-594140792ecc. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1540.217804] env[68492]: DEBUG nova.network.neutron [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Updating instance_info_cache with network_info: [{"id": "f6ac14ff-aca4-4e04-a679-594140792ecc", "address": "fa:16:3e:45:cc:69", "network": {"id": "3a2796a2-829f-4991-b98e-ca4d599e9eda", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-643093598-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b091548dc7c416c9d07bb6c8cd1edf0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "52c1f5eb-3d4a-4faa-a30d-2b0a46430791", "external-id": "nsx-vlan-transportzone-775", "segmentation_id": 775, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6ac14ff-ac", "ovs_interfaceid": "f6ac14ff-aca4-4e04-a679-594140792ecc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1540.226870] env[68492]: DEBUG oslo_concurrency.lockutils [req-8789b043-a629-4564-a84e-67f2bc01cf70 req-e26b2c7a-b595-41d1-a73c-2438f388c70c service nova] Releasing lock "refresh_cache-66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1540.528469] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1540.528668] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1540.528879] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1542.231799] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1544.231121] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.231560] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.231855] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1545.231855] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1545.253923] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254099] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254230] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254369] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254502] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254625] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254748] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254867] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.254984] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.255115] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1545.255236] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1545.255775] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1545.268034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.268175] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.268273] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.268400] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1545.269490] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5640593-c7f8-4bf9-9c04-03d744bfceef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.278153] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db1bd0b-c07c-44a5-8fc4-f861e994ba65 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.291815] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10ccb435-c68c-4c0b-8032-7bbbc82158cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.297770] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69b73eee-4320-4f7c-a706-861b5d14e292 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.327469] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180954MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1545.327610] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.327795] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.399191] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.399360] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.399492] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.399614] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.399762] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.399897] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.400026] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.400146] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.400261] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.400372] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1545.410874] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 2ffaadba-8144-4c60-b055-95619cd75024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1545.421260] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 0b8f7208-aba6-4411-9ce1-1493367220b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1545.431070] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1545.441414] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8bf43303-71b9-4a37-acfd-1915196b71f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1545.450023] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1545.450243] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1545.450393] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1545.612259] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69414f30-6916-4ef4-b7b6-92741c242bbf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.619903] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e088dd-7f7a-4f90-9fc2-13f606c809c3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.648829] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f81c67-5e72-4853-84ad-3ff643d523da {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.655439] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7065e1d9-1fba-468d-9a0e-f3d7f5c7d531 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.668077] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1545.676453] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1545.689275] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1545.689450] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.362s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.666068] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1547.666399] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1548.230897] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1548.230897] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1548.230897] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1550.231200] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1551.537843] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1573.631475] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1586.346943] env[68492]: WARNING oslo_vmware.rw_handles [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1586.346943] env[68492]: ERROR oslo_vmware.rw_handles [ 1586.347574] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1586.351094] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: 
aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1586.351391] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Copying Virtual Disk [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/76290a15-c949-41aa-bb86-134419987582/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1586.351698] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9cae2572-bf25-4793-bf93-24437a3445f2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.360470] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for the task: (returnval){ [ 1586.360470] env[68492]: value = "task-3395523" [ 1586.360470] env[68492]: _type = "Task" [ 1586.360470] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1586.368712] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Task: {'id': task-3395523, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1586.873835] env[68492]: DEBUG oslo_vmware.exceptions [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1586.875019] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1586.875019] env[68492]: ERROR nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1586.875019] env[68492]: Faults: ['InvalidArgument'] [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] yield resources [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.driver.spawn(context, instance, image_meta, [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1586.875019] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._fetch_image_if_missing(context, vi) [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] image_cache(vi, tmp_image_ds_loc) [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] vm_util.copy_virtual_disk( [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] session._wait_for_task(vmdk_copy_task) [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.wait_for_task(task_ref) [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return evt.wait() [ 1586.875385] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = hub.switch() [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.greenlet.switch() [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.f(*self.args, **self.kw) [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exceptions.translate_fault(task_info.error) [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Faults: ['InvalidArgument'] [ 1586.875749] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1586.875749] env[68492]: INFO nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Terminating instance [ 1586.876932] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1586.877162] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1586.877404] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fac0d1fb-797b-4fea-a0ca-2924c6a98930 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.879733] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1586.879926] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1586.880667] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52dd356f-a5aa-434e-867a-bf6faa633405 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.887396] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1586.887610] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-225886ac-c5e7-4301-8d5d-061d8e8898b9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.889814] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1586.890013] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1586.890958] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f3e7667-68aa-41a5-a7c3-85da4f42e3ea {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.895783] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for the task: (returnval){ [ 1586.895783] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527c9ee6-17b8-ff94-a01f-bb5290aae1b6" [ 1586.895783] env[68492]: _type = "Task" [ 1586.895783] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1586.905506] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527c9ee6-17b8-ff94-a01f-bb5290aae1b6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1586.958224] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1586.958224] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1586.958318] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Deleting the datastore file [datastore2] aacdc31e-9a31-4745-b48b-f23a3b16ae9c {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1586.959063] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c1e7c089-2ac1-478b-9274-2f795b74f514 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.965591] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for the task: (returnval){ [ 1586.965591] env[68492]: value = "task-3395525" [ 1586.965591] env[68492]: _type = "Task" [ 1586.965591] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1586.973142] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Task: {'id': task-3395525, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1587.129680] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "aab8759d-db1e-4817-98bf-e1fb45e75640" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1587.129918] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1587.405854] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1587.406205] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Creating directory with path [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1587.406446] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6b9bf199-f4b4-4a47-9cae-84b7d042030d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.418319] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Created directory with path [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1587.418529] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Fetch image to [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1587.418704] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1587.419493] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f6b73eb-8ffb-4c2d-9438-a332576eb70f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.426391] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fed7b7b3-ae04-478a-af94-1525d94d2693 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.435680] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e553c54c-330d-47ea-87c5-4b1fb16f27e5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.471301] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fec47d2d-60b3-400e-8131-56220cf623e5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.479120] env[68492]: DEBUG oslo_vmware.api [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Task: {'id': task-3395525, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076482} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1587.480814] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1587.480998] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1587.481225] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1587.481441] env[68492]: INFO nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1587.483653] env[68492]: DEBUG nova.compute.claims [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1587.483857] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1587.484115] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1587.486766] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-515f229d-5a5d-4d9a-9ddd-a3f3d1f8001b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.513161] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1587.635964] env[68492]: DEBUG oslo_vmware.rw_handles [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1587.697406] env[68492]: DEBUG oslo_vmware.rw_handles [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1587.697684] env[68492]: DEBUG oslo_vmware.rw_handles [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1587.759291] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44426d9e-96e8-4c99-b56a-e718c97f1e36 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.767029] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337d23cc-644a-4fca-9254-1af0304e1eb8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.798805] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-870927da-8935-459b-b5d2-62cdc6813458 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.806176] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d298880-ad73-451b-a54c-607f1dbdb468 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1587.819300] env[68492]: DEBUG nova.compute.provider_tree [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1587.828023] env[68492]: DEBUG nova.scheduler.client.report [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1587.842818] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.358s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1587.843138] env[68492]: ERROR nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1587.843138] env[68492]: Faults: ['InvalidArgument'] [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1587.843138] env[68492]: ERROR 
nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.driver.spawn(context, instance, image_meta, [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._fetch_image_if_missing(context, vi) [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] image_cache(vi, tmp_image_ds_loc) [ 1587.843138] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] vm_util.copy_virtual_disk( [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] session._wait_for_task(vmdk_copy_task) [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.wait_for_task(task_ref) [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return evt.wait() [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = hub.switch() [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.greenlet.switch() [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1587.843552] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.f(*self.args, **self.kw) [ 1587.843920] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1587.843920] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exceptions.translate_fault(task_info.error) [ 1587.843920] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1587.843920] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Faults: ['InvalidArgument'] [ 1587.843920] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.843920] env[68492]: DEBUG nova.compute.utils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1587.845618] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Build of instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c was re-scheduled: A specified parameter was not correct: fileType [ 1587.845618] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1587.846012] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1587.846203] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1587.846352] env[68492]: DEBUG nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1587.846521] env[68492]: DEBUG nova.network.neutron [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1587.949765] env[68492]: DEBUG neutronclient.v2_0.client [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1587.950931] env[68492]: ERROR nova.compute.manager [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.driver.spawn(context, instance, image_meta, [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._fetch_image_if_missing(context, vi) [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] image_cache(vi, tmp_image_ds_loc) [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1587.950931] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] vm_util.copy_virtual_disk( [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] session._wait_for_task(vmdk_copy_task) [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.wait_for_task(task_ref) [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return evt.wait() [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = hub.switch() [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.greenlet.switch() [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.f(*self.args, **self.kw) [ 1587.951316] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exceptions.translate_fault(task_info.error) [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Faults: ['InvalidArgument'] [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] During handling of the above exception, another exception occurred: [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._build_and_run_instance(context, instance, image, [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 
1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exception.RescheduledException( [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] nova.exception.RescheduledException: Build of instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c was re-scheduled: A specified parameter was not correct: fileType [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Faults: ['InvalidArgument'] [ 1587.951684] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] During handling of the above exception, another exception occurred: [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] exception_handler_v20(status_code, error_body) [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise client_exc(message=error_message, [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Neutron server returns request_ids: ['req-aa31661b-d6cb-43a4-90e0-07a6138c95bb'] [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.952131] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] During handling of the above exception, another exception occurred: [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last): [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._deallocate_network(context, instance, requested_networks) [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File 
"/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.network_api.deallocate_for_instance( [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] data = neutron.list_ports(**search_opts) [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.list('ports', self.ports_path, retrieve_all, [ 1587.952508] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] for r in self._pagination(collection, path, **params): [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] res = self.get(path, params=params) [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.retry_request("GET", action, body=body, [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1587.952897] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return 
self.do_request(method, action, body=body, [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs) [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._handle_fault_response(status_code, replybody, resp) [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exception.Unauthorized() [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] nova.exception.Unauthorized: Not authorized. [ 1587.953304] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] [ 1588.007772] env[68492]: INFO nova.scheduler.client.report [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Deleted allocations for instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c [ 1588.028430] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b7be7b08-ac1b-43e5-a454-f130ce6ae21c tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 607.708s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1588.029485] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 427.783s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1588.029676] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] During sync_power_state the instance has a pending task (spawning). Skip.
[ 1588.030068] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1588.030501] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 411.426s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1588.030720] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Acquiring lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1588.030924] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1588.031111] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1588.032848] env[68492]: INFO nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Terminating instance [ 1588.034537] env[68492]: DEBUG nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Start destroying the instance on the hypervisor.
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1588.034659] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1588.034955] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6a85510b-9440-4ad6-bd77-e3e0ece5e574 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1588.044696] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc12f083-edf7-4884-bff7-619a3a9a3caf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1588.055227] env[68492]: DEBUG nova.compute.manager [None req-86bf1bc2-b937-4b3d-ba2b-cc6780921a49 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 2ffaadba-8144-4c60-b055-95619cd75024] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1588.077167] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aacdc31e-9a31-4745-b48b-f23a3b16ae9c could not be found. [ 1588.077384] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1588.077563] env[68492]: INFO nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1588.077858] env[68492]: DEBUG oslo.service.loopingcall [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1588.078135] env[68492]: DEBUG nova.compute.manager [-] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1588.078234] env[68492]: DEBUG nova.network.neutron [-] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1588.080941] env[68492]: DEBUG nova.compute.manager [None req-86bf1bc2-b937-4b3d-ba2b-cc6780921a49 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 2ffaadba-8144-4c60-b055-95619cd75024] Instance disappeared before build.
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1588.104735] env[68492]: DEBUG oslo_concurrency.lockutils [None req-86bf1bc2-b937-4b3d-ba2b-cc6780921a49 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "2ffaadba-8144-4c60-b055-95619cd75024" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.431s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1588.115763] env[68492]: DEBUG nova.compute.manager [None req-564de4ee-9385-4996-b9a1-651a0a78f64d tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 0b8f7208-aba6-4411-9ce1-1493367220b0] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1588.142181] env[68492]: DEBUG nova.compute.manager [None req-564de4ee-9385-4996-b9a1-651a0a78f64d tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 0b8f7208-aba6-4411-9ce1-1493367220b0] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1588.163179] env[68492]: DEBUG oslo_concurrency.lockutils [None req-564de4ee-9385-4996-b9a1-651a0a78f64d tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "0b8f7208-aba6-4411-9ce1-1493367220b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.177s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1588.173617] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1588.183601] env[68492]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68492) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1588.183840] env[68492]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1588.184366] env[68492]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body)
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall raise client_exc(message=error_message,
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-88d2dccb-b2e9-4c97-ae94-8cbc201ef91a']
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred:
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1588.184366] env[68492]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw)
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall result = f(*args, **kwargs)
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall self._deallocate_network(
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance(
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts)
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all,
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params):
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1588.185023] env[68492]: ERROR oslo.service.loopingcall res = self.get(path, params=params)
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body,
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body,
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs)
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp)
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.185825] env[68492]: ERROR oslo.service.loopingcall
[ 1588.186331] env[68492]: ERROR nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.214046] env[68492]: ERROR nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last):
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] exception_handler_v20(status_code, error_body)
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise client_exc(message=error_message,
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Neutron server returns request_ids: ['req-88d2dccb-b2e9-4c97-ae94-8cbc201ef91a']
[ 1588.214046] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c]
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] During handling of the above exception, another exception occurred:
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c]
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Traceback (most recent call last):
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._delete_instance(context, instance, bdms)
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._shutdown_instance(context, instance, bdms)
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._try_deallocate_network(context, instance, requested_networks)
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] with excutils.save_and_reraise_exception():
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.214411] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.force_reraise()
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise self.value
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] _deallocate_network_with_retries()
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return evt.wait()
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = hub.switch()
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.greenlet.switch()
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1588.214874] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = func(*self.args, **self.kw)
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] result = f(*args, **kwargs)
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._deallocate_network(
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self.network_api.deallocate_for_instance(
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] data = neutron.list_ports(**search_opts)
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.list('ports', self.ports_path, retrieve_all,
[ 1588.215246] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] for r in self._pagination(collection, path, **params):
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] res = self.get(path, params=params)
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.retry_request("GET", action, body=body,
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1588.215622] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] return self.do_request(method, action, body=body,
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] ret = obj(*args, **kwargs)
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] self._handle_fault_response(status_code, replybody, resp)
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.215980] env[68492]: ERROR nova.compute.manager [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c]
[ 1588.223186] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1588.223437] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1588.225348] env[68492]: INFO nova.compute.claims [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1588.241271] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Lock "aacdc31e-9a31-4745-b48b-f23a3b16ae9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.211s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1588.292507] env[68492]: INFO nova.compute.manager [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] [instance: aacdc31e-9a31-4745-b48b-f23a3b16ae9c] Successfully reverted task state from None on failure for instance.
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server [None req-c03e7b04-eb8d-4816-bddb-e98670f9b8ce tempest-TenantUsagesTestJSON-1656939724 tempest-TenantUsagesTestJSON-1656939724-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body)
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message,
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-88d2dccb-b2e9-4c97-ae94-8cbc201ef91a']
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1588.296021] env[68492]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server raise self.value
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.296520] env[68492]: ERROR oslo_messaging.rpc.server raise self.value
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1453, in decorated_function
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server raise self.value
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms)
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance
[ 1588.297046] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server raise self.value
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms)
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms)
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks)
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server self.force_reraise()
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server raise self.value
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network
[ 1588.297567] env[68492]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries()
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server return evt.wait()
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server result = hub.switch()
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server return self.greenlet.switch()
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw)
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs)
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server self._deallocate_network(
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance(
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts)
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.298191] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all,
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params):
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params)
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body,
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body,
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1588.298752] env[68492]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs)
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp)
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1588.299281] env[68492]: ERROR oslo_messaging.rpc.server
[ 1588.422325] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8a0dab6-6b33-4fe9-89fa-704295f45f04 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1588.430363] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0e4b50-d8f8-44b4-a498-c0bf00726b53 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1588.460899] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2210e60-d80d-4cd1-953f-14a8d6fd790d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1588.467707] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae79320-755e-4da7-80b4-2e074475aef5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1588.480803] env[68492]: DEBUG nova.compute.provider_tree [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1588.490727] env[68492]: DEBUG nova.scheduler.client.report [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1588.504151] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1588.504594] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1588.538438] env[68492]: DEBUG nova.compute.utils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1588.540503] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1588.540675] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1588.551329] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1588.601099] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1588.619244] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1588.636232] env[68492]: DEBUG nova.policy [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c1394a2cfbd4895a43beeac0537d300', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '742d1d3d024340819f586a7cc267d224', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1588.646276] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1588.646508] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1588.646666] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1588.646845] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1588.646992] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1588.647164] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1588.647368] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1588.647526] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1588.647745] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1588.647890] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1588.648119] env[68492]: DEBUG nova.virt.hardware [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1588.648975] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afbb0545-5a77-4a4c-b626-a2bfc6d53d67 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1588.656838] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948e9469-d32e-40e3-a18d-6c1937c1a4e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1589.006211] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Successfully created port: db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1589.577804] env[68492]: DEBUG nova.compute.manager [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Received event network-vif-plugged-db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1589.578084] env[68492]: DEBUG oslo_concurrency.lockutils [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] Acquiring lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1589.578692] env[68492]: DEBUG oslo_concurrency.lockutils [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1589.578906] env[68492]: DEBUG oslo_concurrency.lockutils [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1589.579336] env[68492]: DEBUG nova.compute.manager [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] No waiting events found dispatching network-vif-plugged-db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1589.579573] env[68492]: WARNING nova.compute.manager [req-89770de1-578d-4d74-abc1-5722e0615f01 req-64d4576f-959e-4835-9e44-dc26c6847cc3 service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Received unexpected event network-vif-plugged-db9903a3-7d44-4d0b-a156-00024776214e for instance with vm_state building and task_state spawning.
[ 1589.661223] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Successfully updated port: db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1589.676046] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1589.676195] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquired lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1589.676332] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 1589.759382] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 1590.279613] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updating instance_info_cache with network_info: [{"id": "db9903a3-7d44-4d0b-a156-00024776214e", "address": "fa:16:3e:6b:3d:6a", "network": {"id": "216a9926-ce9c-40c6-bfae-abf4897ea16e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1663398245-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "742d1d3d024340819f586a7cc267d224", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6076d24d-3c8e-4bbb-ba96-a08fb27a73cc", "external-id": "nsx-vlan-transportzone-267", "segmentation_id": 267, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb9903a3-7d", "ovs_interfaceid": "db9903a3-7d44-4d0b-a156-00024776214e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1590.292272] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Releasing lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1590.292551] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance network_info: |[{"id": "db9903a3-7d44-4d0b-a156-00024776214e", "address": "fa:16:3e:6b:3d:6a", "network": {"id": "216a9926-ce9c-40c6-bfae-abf4897ea16e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1663398245-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "742d1d3d024340819f586a7cc267d224", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6076d24d-3c8e-4bbb-ba96-a08fb27a73cc", "external-id": "nsx-vlan-transportzone-267", "segmentation_id": 267, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb9903a3-7d", "ovs_interfaceid": "db9903a3-7d44-4d0b-a156-00024776214e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1590.292930] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:3d:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6076d24d-3c8e-4bbb-ba96-a08fb27a73cc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'db9903a3-7d44-4d0b-a156-00024776214e', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1590.300693] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Creating folder: Project (742d1d3d024340819f586a7cc267d224). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1590.301298] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-88f2556f-9d2c-4c2d-96e2-1660a35c8232 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1590.314881] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Created folder: Project (742d1d3d024340819f586a7cc267d224) in parent group-v677434.
[ 1590.315120] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Creating folder: Instances. Parent ref: group-v677537. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1590.315416] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-39836a32-52f1-4639-a2e7-4b28debe78bb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1590.324646] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Created folder: Instances in parent group-v677537.
[ 1590.324935] env[68492]: DEBUG oslo.service.loopingcall [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1590.325180] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1590.325427] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b928391c-8aed-4fe4-add2-28b270ce7d05 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1590.345083] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1590.345083] env[68492]: value = "task-3395528"
[ 1590.345083] env[68492]: _type = "Task"
[ 1590.345083] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1590.352649] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395528, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1590.855553] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395528, 'name': CreateVM_Task, 'duration_secs': 0.283442} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1590.855927] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1590.856637] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1590.856845] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1590.857211] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1590.857493] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b167ea7-120f-405b-b7d9-312a4bc8617f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1590.862143] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for the task: (returnval){
[ 1590.862143] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52bf882f-fff7-f7ca-c724-88c77a05e786"
[ 1590.862143] env[68492]: _type = "Task"
[ 1590.862143] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1590.871625] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52bf882f-fff7-f7ca-c724-88c77a05e786, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1591.371800] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1591.372078] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1591.372377] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1591.604024] env[68492]: DEBUG nova.compute.manager [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Received event network-changed-db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1591.604227] env[68492]: DEBUG nova.compute.manager [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Refreshing instance network info cache due to event network-changed-db9903a3-7d44-4d0b-a156-00024776214e. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1591.604474] env[68492]: DEBUG oslo_concurrency.lockutils [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] Acquiring lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1591.604572] env[68492]: DEBUG oslo_concurrency.lockutils [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] Acquired lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1591.604734] env[68492]: DEBUG nova.network.neutron [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Refreshing network info cache for port db9903a3-7d44-4d0b-a156-00024776214e {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 1591.831709] env[68492]: DEBUG nova.network.neutron [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updated VIF entry in instance network info cache for port db9903a3-7d44-4d0b-a156-00024776214e. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 1591.832074] env[68492]: DEBUG nova.network.neutron [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updating instance_info_cache with network_info: [{"id": "db9903a3-7d44-4d0b-a156-00024776214e", "address": "fa:16:3e:6b:3d:6a", "network": {"id": "216a9926-ce9c-40c6-bfae-abf4897ea16e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1663398245-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "742d1d3d024340819f586a7cc267d224", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6076d24d-3c8e-4bbb-ba96-a08fb27a73cc", "external-id": "nsx-vlan-transportzone-267", "segmentation_id": 267, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb9903a3-7d", "ovs_interfaceid": "db9903a3-7d44-4d0b-a156-00024776214e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1591.840998] env[68492]: DEBUG oslo_concurrency.lockutils [req-a55ffb05-56fd-4f1c-95d6-e081ab39fb81 req-d75f19b4-99d8-4613-aba5-43cc562eef3c service nova] Releasing lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1593.534873] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock
"e6c9ab71-8507-4238-9936-fd9a61101313" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1593.535211] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.231736] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1604.226566] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1604.249602] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1605.231593] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1605.231738] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1605.231852] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1605.252349] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.252618] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.252664] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.252755] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.252880] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253011] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253137] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253257] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253374] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253489] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1605.253604] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
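
The "Running periodic task ComputeManager._*" entries come from oslo.service's periodic-task machinery: methods decorated with periodic_task on a PeriodicTasks subclass are collected and invoked by run_periodic_tasks on a timer. A runnable sketch of that wiring is below (run_immediately=True so the single manual call fires the task; nova instead drives this from a recurring timer, and its real _heal_instance_info_cache skips instances still in the Building state, as logged above).

    # Minimal sketch of the oslo.service periodic-task wiring.
    from oslo_config import cfg
    from oslo_service import periodic_task

    class ComputeManagerSketch(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _heal_instance_info_cache(self, context):
            # stands in for nova's cache-heal walk shown in the log
            print("rebuilding the list of instances to heal")

    mgr = ComputeManagerSketch(cfg.CONF)
    mgr.run_periodic_tasks(context=None)   # logs "Running periodic task ..."
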
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1607.231033] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1607.243345] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1607.243567] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.243731] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1607.243889] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1607.245073] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bce5b805-0c0e-4aa0-af7a-359c2677bbb4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.253988] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8b407c2-4801-4f8f-a8d4-eded6e9f0046 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.267946] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b70a5c-6d83-4823-b7fd-360f3baa333e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.274313] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94b64157-a86f-4ff5-bf9e-05f8763fb3ac {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.302983] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180971MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1607.303155] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1607.303459] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1607.376985] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377169] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377300] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377422] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377541] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377659] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377820] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.377976] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.378070] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.378176] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1607.390089] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 8bf43303-71b9-4a37-acfd-1915196b71f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1607.400962] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1607.410512] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1607.419825] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
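
The resource audit above ties together two views: the raw hypervisor numbers (free_ram=180971MB, free_vcpus=48) and the placement inventory reported just below, where reserved and allocation_ratio determine how much placement will actually hand out. Per resource class the capacity is (total - reserved) * allocation_ratio, so 48 physical VCPUs with a 4.0 ratio allow 192 VCPU allocations (max_unit 16 still caps any single instance). A worked check of those numbers:

    # Capacity derivation for the inventory payload logged below:
    #   capacity = (total - reserved) * allocation_ratio
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
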
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1607.420091] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1607.420252] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1607.582025] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44ef4fcc-d123-4e78-bffc-24869762b741 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.591021] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b896657-4296-426e-b82c-5cf6f463b41b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.622076] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad4e7d61-eafd-4adb-b084-5b10e9a37bcd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.629532] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e7b84e-10bb-45cb-bcbd-5a729c7fd0a0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1607.642820] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1607.652209] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1607.666686] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1607.666862] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1608.668066] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1608.668425] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1609.226756] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1609.230420] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1609.230575] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1611.232831] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1631.522402] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.522402] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1633.041150] env[68492]: WARNING oslo_vmware.rw_handles [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1633.041150]
env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1633.041150] env[68492]: ERROR oslo_vmware.rw_handles [ 1633.042101] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1633.043844] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1633.044195] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Copying Virtual Disk [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/b122133a-c866-4fa7-aa81-e97c8242fe73/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1633.044533] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-07807012-cbbb-45a6-a4cc-823e095ecc01 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.052889] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for the task: (returnval){ [ 1633.052889] env[68492]: value = "task-3395529" [ 1633.052889] env[68492]: _type = "Task" [ 1633.052889] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1633.060385] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Task: {'id': task-3395529, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1633.564775] env[68492]: DEBUG oslo_vmware.exceptions [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1633.565063] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1633.565614] env[68492]: ERROR nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1633.565614] env[68492]: Faults: ['InvalidArgument'] [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Traceback (most recent call last): [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] yield resources [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self.driver.spawn(context, instance, image_meta, [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self._fetch_image_if_missing(context, vi) [ 1633.565614] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] image_cache(vi, tmp_image_ds_loc) [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] vm_util.copy_virtual_disk( [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] session._wait_for_task(vmdk_copy_task) [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return self.wait_for_task(task_ref) [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return evt.wait() [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] result = hub.switch() [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1633.566070] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return self.greenlet.switch() [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self.f(*self.args, **self.kw) [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] raise exceptions.translate_fault(task_info.error) [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Faults: ['InvalidArgument'] [ 1633.566497] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] [ 1633.566497] env[68492]: INFO nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Terminating instance [ 1633.567521] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1633.567624] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1633.568272] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 
tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1633.568462] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1633.568684] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5efa6eda-bb14-428a-a4da-67be7bf70448 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.571018] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b08c36c4-0a39-4fe9-975d-958c50c3bc4f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.578196] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1633.578414] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9e21dbef-72b8-44c7-a254-8f60e87ceeec {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.580520] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1633.580698] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1633.581707] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dcd0fc95-aa88-45aa-958e-1275adef48d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.586263] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1633.586263] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522026e0-29b4-9f97-fc1e-59b0b116937e" [ 1633.586263] env[68492]: _type = "Task" [ 1633.586263] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1633.594443] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]522026e0-29b4-9f97-fc1e-59b0b116937e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1633.646795] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1633.647096] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1633.647327] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Deleting the datastore file [datastore2] 685c54e1-5251-4ea2-a4bb-fcdafe9d270c {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1633.647640] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-61ff1e78-dfb7-4999-be31-2809a98ee133 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.655016] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for the task: (returnval){ [ 1633.655016] env[68492]: value = "task-3395531" [ 1633.655016] env[68492]: _type = "Task" [ 1633.655016] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1633.663415] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Task: {'id': task-3395531, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1634.096363] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1634.096649] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating directory with path [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1634.096851] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7140b0a6-2f44-45be-ae16-c312a289d0e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.107942] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Created directory with path [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1634.108168] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Fetch image to [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1634.108343] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1634.109042] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b412299-f7d3-4ccb-972e-09cad8b7f049 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.115181] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43611407-cde9-4133-84db-51021d6a26d8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.123894] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3071888-59c5-49c1-901c-4d78d47aa468 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.153132] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8ecfa3de-39ee-4826-b63f-859b81e30626 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.162392] env[68492]: DEBUG oslo_vmware.api [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Task: {'id': task-3395531, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074828} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1634.163703] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1634.163888] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1634.164072] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1634.164249] env[68492]: INFO nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1634.165945] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-efafd4b1-fd74-4bb3-8309-0e5bea3ecf41 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.167747] env[68492]: DEBUG nova.compute.claims [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1634.167919] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1634.168167] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1634.189735] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1634.339711] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1634.400032] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1634.400135] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
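
This is the same write-handle flow that failed earlier at 1633.041150: rw_handles streams the image to the datastore over HTTP and, in close() (rw_handles.py line 283 in the traceback), reads the server's response, which can raise http.client.RemoteDisconnected when the ESX host drops the connection without replying. oslo.vmware downgrades that to a warning and proceeds, since the upload itself already finished. A sketch of that tolerant close, assuming conn is an http.client connection with the request already sent (the helper name is hypothetical):

    # Mirrors the close() behaviour seen in the 1633.041150 traceback.
    import http.client
    import logging

    LOG = logging.getLogger(__name__)

    def close_write_handle(conn):
        try:
            conn.getresponse()                 # may raise RemoteDisconnected
        except http.client.RemoteDisconnected:
            LOG.warning("Remote end closed connection without response; "
                        "treating upload as complete")
        finally:
            conn.close()
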
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1634.443800] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57fe2548-424d-40a6-b94c-8cb4ea7be15e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.450969] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-609f0abe-5c37-4ce8-a8df-353c364176be {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.481029] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04996993-ecad-43bb-b15f-9f2f4d869a74 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.489086] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3135b93f-5e39-4a2e-ab6c-2e62a7d0b137 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1634.503583] env[68492]: DEBUG nova.compute.provider_tree [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1634.512798] env[68492]: DEBUG nova.scheduler.client.report [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1634.528147] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.360s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1634.528683] env[68492]: ERROR nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1634.528683] env[68492]: Faults: ['InvalidArgument'] [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Traceback (most recent call last): [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, 
in _build_and_run_instance [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self.driver.spawn(context, instance, image_meta, [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self._fetch_image_if_missing(context, vi) [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] image_cache(vi, tmp_image_ds_loc) [ 1634.528683] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] vm_util.copy_virtual_disk( [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] session._wait_for_task(vmdk_copy_task) [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return self.wait_for_task(task_ref) [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return evt.wait() [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] result = hub.switch() [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] return self.greenlet.switch() [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1634.529091] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] self.f(*self.args, **self.kw) [ 1634.529405] env[68492]: ERROR nova.compute.manager [instance: 
685c54e1-5251-4ea2-a4bb-fcdafe9d270c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1634.529405] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] raise exceptions.translate_fault(task_info.error) [ 1634.529405] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1634.529405] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Faults: ['InvalidArgument'] [ 1634.529405] env[68492]: ERROR nova.compute.manager [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] [ 1634.529534] env[68492]: DEBUG nova.compute.utils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1634.530842] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Build of instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c was re-scheduled: A specified parameter was not correct: fileType [ 1634.530842] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1634.531221] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1634.531396] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1634.531560] env[68492]: DEBUG nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1634.531720] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1635.038749] env[68492]: DEBUG nova.network.neutron [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1635.051621] env[68492]: INFO nova.compute.manager [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Took 0.52 seconds to deallocate network for instance. [ 1635.147586] env[68492]: INFO nova.scheduler.client.report [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Deleted allocations for instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c [ 1635.166530] env[68492]: DEBUG oslo_concurrency.lockutils [None req-cb691562-8362-4590-8c51-2e7aeb0ce8a6 tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 627.004s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1635.167646] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 430.853s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1635.167873] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Acquiring lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1635.168152] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock 
"685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1635.168330] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1635.170527] env[68492]: INFO nova.compute.manager [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Terminating instance [ 1635.172193] env[68492]: DEBUG nova.compute.manager [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1635.172385] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1635.172850] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2cccd48e-8634-4795-b0f0-7fbbb21e4290 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.183663] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65a4b09a-41d6-41f3-a41a-f6cd4054338d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.194047] env[68492]: DEBUG nova.compute.manager [None req-552b44ee-eb55-4493-bf5a-dac02867570d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: 8bf43303-71b9-4a37-acfd-1915196b71f4] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1635.215237] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 685c54e1-5251-4ea2-a4bb-fcdafe9d270c could not be found. 
[ 1635.215462] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1635.215641] env[68492]: INFO nova.compute.manager [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1635.215885] env[68492]: DEBUG oslo.service.loopingcall [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1635.216173] env[68492]: DEBUG nova.compute.manager [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1635.216275] env[68492]: DEBUG nova.network.neutron [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1635.238325] env[68492]: DEBUG nova.compute.manager [None req-552b44ee-eb55-4493-bf5a-dac02867570d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] [instance: 8bf43303-71b9-4a37-acfd-1915196b71f4] Instance disappeared before build. {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1635.242633] env[68492]: DEBUG nova.network.neutron [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1635.255418] env[68492]: INFO nova.compute.manager [-] [instance: 685c54e1-5251-4ea2-a4bb-fcdafe9d270c] Took 0.04 seconds to deallocate network for instance. [ 1635.257731] env[68492]: DEBUG oslo_concurrency.lockutils [None req-552b44ee-eb55-4493-bf5a-dac02867570d tempest-ImagesTestJSON-368871249 tempest-ImagesTestJSON-368871249-project-member] Lock "8bf43303-71b9-4a37-acfd-1915196b71f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.249s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1635.266113] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1635.316846] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1635.317107] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1635.318628] env[68492]: INFO nova.compute.claims [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1635.353940] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e1c6656d-9bf7-466a-8421-30fdbc5f433f tempest-ServerRescueTestJSONUnderV235-1368973626 tempest-ServerRescueTestJSONUnderV235-1368973626-project-member] Lock "685c54e1-5251-4ea2-a4bb-fcdafe9d270c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.186s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1635.522990] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-186a2b9c-7b31-4ab2-b292-063108b7f618 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.531275] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf162edb-6cbc-4e61-90eb-c967f8447131 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.561737] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6b84a06-34e6-4834-a05c-854df0a602c0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.568792] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f25c881-f077-48a4-9b79-8d122ddbf32f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.581952] env[68492]: DEBUG nova.compute.provider_tree [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1635.590580] env[68492]: DEBUG nova.scheduler.client.report [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 
'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1635.603827] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1635.604311] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1635.640512] env[68492]: DEBUG nova.compute.utils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1635.641710] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1635.641877] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1635.651522] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Start building block device mappings for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1635.714479] env[68492]: DEBUG nova.policy [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a6986ddc4824f98898c9348a88eb2fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3be38f7dedc4da3886f7278f14176b8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1635.717762] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1635.742782] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1635.743016] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1635.743182] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1635.743367] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1635.743511] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1635.743655] env[68492]: DEBUG 
nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1635.743863] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1635.744030] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1635.744204] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1635.744362] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1635.744527] env[68492]: DEBUG nova.virt.hardware [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1635.745397] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cdf5c56-0224-41e5-97f2-cf66f6393ce9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1635.755428] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a70dd88e-e0ab-43f2-a041-f175429fcc2c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1636.296672] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Successfully created port: 6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1637.053696] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Successfully updated port: 6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1637.065571] env[68492]: DEBUG oslo_concurrency.lockutils [None 
req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1637.065731] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquired lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1637.065882] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1637.078674] env[68492]: DEBUG nova.compute.manager [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Received event network-vif-plugged-6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1637.078876] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Acquiring lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1637.079088] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1637.079252] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1637.079486] env[68492]: DEBUG nova.compute.manager [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] No waiting events found dispatching network-vif-plugged-6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1637.079698] env[68492]: WARNING nova.compute.manager [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Received unexpected event network-vif-plugged-6f3f6c27-5a94-4151-a26c-95876e9649b3 for instance with vm_state building and task_state spawning. 
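The records above show the external-event path: neutron reports network-vif-plugged for port 6f3f6c27, nova takes the per-instance "-events" lock, and pop_instance_event finds no registered waiter, so the event is logged as unexpected (the instance is still building and nothing called prepare/wait for that event yet). Here is a rough sketch of that registry pattern; EventRegistry and its method names are hypothetical simplifications, not nova's actual InstanceEvents class.

```python
import threading


class EventRegistry:
    """Hypothetical per-instance event registry mirroring the lock/pop
    pattern in the log: prepare a waiter, external event arrives, pop it."""

    def __init__(self):
        # Plays the role of the "<uuid>-events" lock in the log.
        self._lock = threading.Lock()
        # {instance_uuid: {event_name: threading.Event}}
        self._waiters = {}

    def prepare(self, instance_uuid, event_name):
        """Register a waiter before starting the operation that triggers
        the event (e.g. before plugging a VIF)."""
        with self._lock:
            evt = threading.Event()
            self._waiters.setdefault(instance_uuid, {})[event_name] = evt
            return evt

    def pop(self, instance_uuid, event_name):
        """Called when the external service reports the event; returns the
        registered waiter, or None if nobody was waiting."""
        with self._lock:
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)


registry = EventRegistry()
# No prepare() was issued for this instance, so the pop finds nothing,
# which is the "Received unexpected event" branch seen in the log:
waiter = registry.pop(
    "a90e989d-6aef-482f-b767-8dbdd7f29628",
    "network-vif-plugged-6f3f6c27-5a94-4151-a26c-95876e9649b3")
if waiter is None:
    print("WARNING: unexpected event for instance still building")
else:
    waiter.set()
```

This also explains why the warning is harmless here: during a normal build the port is plugged as part of spawning, the event simply arrives before the driver registers interest, and the subsequent network-changed event still triggers the cache refresh seen immediately after.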
[ 1637.079861] env[68492]: DEBUG nova.compute.manager [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Received event network-changed-6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1637.080024] env[68492]: DEBUG nova.compute.manager [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Refreshing instance network info cache due to event network-changed-6f3f6c27-5a94-4151-a26c-95876e9649b3. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1637.080192] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Acquiring lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1637.104304] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1637.254533] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Updating instance_info_cache with network_info: [{"id": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "address": "fa:16:3e:67:8c:da", "network": {"id": "c3179d21-3aa4-4a21-aa5f-6c0f5821c2b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1333344811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d3be38f7dedc4da3886f7278f14176b8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d33839ae-40ca-471b-92e3-eb282b920682", "external-id": "nsx-vlan-transportzone-416", "segmentation_id": 416, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f3f6c27-5a", "ovs_interfaceid": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1637.265115] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Releasing lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1637.265389] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 
tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance network_info: |[{"id": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "address": "fa:16:3e:67:8c:da", "network": {"id": "c3179d21-3aa4-4a21-aa5f-6c0f5821c2b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1333344811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d3be38f7dedc4da3886f7278f14176b8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d33839ae-40ca-471b-92e3-eb282b920682", "external-id": "nsx-vlan-transportzone-416", "segmentation_id": 416, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f3f6c27-5a", "ovs_interfaceid": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1637.265675] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Acquired lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1637.265850] env[68492]: DEBUG nova.network.neutron [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Refreshing network info cache for port 6f3f6c27-5a94-4151-a26c-95876e9649b3 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1637.266821] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:8c:da', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd33839ae-40ca-471b-92e3-eb282b920682', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6f3f6c27-5a94-4151-a26c-95876e9649b3', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1637.274499] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Creating folder: Project (d3be38f7dedc4da3886f7278f14176b8). Parent ref: group-v677434. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1637.275301] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ad792dd7-4588-4957-b7c8-06d0125f50d7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.288626] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Created folder: Project (d3be38f7dedc4da3886f7278f14176b8) in parent group-v677434. [ 1637.288821] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Creating folder: Instances. Parent ref: group-v677540. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1637.289074] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-288cd17f-c267-4538-9337-0792abaf6335 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.297541] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Created folder: Instances in parent group-v677540. [ 1637.297798] env[68492]: DEBUG oslo.service.loopingcall [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1637.297943] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1637.298170] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7da2162b-a465-4013-b2b9-1a0fd41bec46 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.319187] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1637.319187] env[68492]: value = "task-3395534" [ 1637.319187] env[68492]: _type = "Task" [ 1637.319187] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1637.326566] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395534, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1637.632507] env[68492]: DEBUG nova.network.neutron [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Updated VIF entry in instance network info cache for port 6f3f6c27-5a94-4151-a26c-95876e9649b3. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1637.632874] env[68492]: DEBUG nova.network.neutron [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Updating instance_info_cache with network_info: [{"id": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "address": "fa:16:3e:67:8c:da", "network": {"id": "c3179d21-3aa4-4a21-aa5f-6c0f5821c2b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1333344811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d3be38f7dedc4da3886f7278f14176b8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d33839ae-40ca-471b-92e3-eb282b920682", "external-id": "nsx-vlan-transportzone-416", "segmentation_id": 416, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6f3f6c27-5a", "ovs_interfaceid": "6f3f6c27-5a94-4151-a26c-95876e9649b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1637.641945] env[68492]: DEBUG oslo_concurrency.lockutils [req-0376cce2-2c80-4761-b9b5-6190782af474 req-1a6bc53a-ee2c-4b0f-906d-faf084f035e8 service nova] Releasing lock "refresh_cache-a90e989d-6aef-482f-b767-8dbdd7f29628" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1637.829823] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395534, 'name': CreateVM_Task} progress is 25%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1638.330496] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395534, 'name': CreateVM_Task} progress is 25%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1638.831089] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395534, 'name': CreateVM_Task, 'duration_secs': 1.266972} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1638.831273] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1638.831897] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1638.832081] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1638.832398] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1638.832644] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8f3137fa-a979-498a-95af-79eb9b37ae9d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1638.836798] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for the task: (returnval){ [ 1638.836798] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52af8d4b-f1d3-fce6-0429-522aa11c2766" [ 1638.836798] env[68492]: _type = "Task" [ 1638.836798] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1638.844286] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52af8d4b-f1d3-fce6-0429-522aa11c2766, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1639.347566] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1639.347892] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1639.348095] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1640.650614] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "a9111481-6ba1-4d76-bce9-8db609eb704d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1640.650909] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.827613] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "a90e989d-6aef-482f-b767-8dbdd7f29628" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1664.230657] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.231798] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1666.233161] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache 
{{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1666.233161] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1666.233161] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1666.253637] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.253851] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254026] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254194] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254346] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254495] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254644] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254789] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.254931] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.255101] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1666.255258] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1667.231712] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1667.243659] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1667.243962] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1667.244065] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1667.244211] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1667.245361] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bc7e556-d344-4c97-9a46-03833c37776d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.254576] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d9d787-c693-4132-a89b-108bdeeb26a4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.269539] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4ea856d-7b09-4e0b-a18b-de1e19d66604 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.276729] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d31a4339-50e5-4351-955f-4ec8e56d4f98 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.305505] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node 
resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180955MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1667.305667] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1667.305844] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1667.386371] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.386534] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.386655] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.386775] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.386892] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.387017] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.387144] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.387260] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.387373] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.387487] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1667.399844] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1667.410368] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1667.420571] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1667.430368] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1667.431032] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1667.431032] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1667.582372] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bccbf442-31f4-4980-8c57-8e121e872fa8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.589643] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064f406a-0247-4cc0-8a2b-b27875f4dd6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.620293] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3134f09b-aa19-4e2d-a5bb-88994bb1ae99 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.627645] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be61cced-8d65-454d-8a00-01359b632dfa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1667.640483] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1667.649716] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1667.662663] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1667.662846] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1669.662601] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.662910] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.662996] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1670.226688] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1670.230388] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1672.230370] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1680.721358] env[68492]: WARNING oslo_vmware.rw_handles [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1680.721358] env[68492]: ERROR oslo_vmware.rw_handles [ 1680.722018] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: 
e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1680.723950] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1680.724217] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Copying Virtual Disk [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/b138e6d1-1f0f-493e-af83-13375eede285/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1680.724515] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f995a550-c007-4286-b3b4-d066bf29b865 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.733178] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1680.733178] env[68492]: value = "task-3395535" [ 1680.733178] env[68492]: _type = "Task" [ 1680.733178] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1680.740739] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395535, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.245393] env[68492]: DEBUG oslo_vmware.exceptions [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1681.245669] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1681.246276] env[68492]: ERROR nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1681.246276] env[68492]: Faults: ['InvalidArgument'] [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Traceback (most recent call last): [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] yield resources [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self.driver.spawn(context, instance, image_meta, [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self._fetch_image_if_missing(context, vi) [ 1681.246276] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] image_cache(vi, tmp_image_ds_loc) [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] vm_util.copy_virtual_disk( [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] session._wait_for_task(vmdk_copy_task) [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return self.wait_for_task(task_ref) [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return evt.wait() [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] result = hub.switch() [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1681.246927] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return self.greenlet.switch() [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self.f(*self.args, **self.kw) [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] raise exceptions.translate_fault(task_info.error) [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Faults: ['InvalidArgument'] [ 1681.247374] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] [ 1681.247374] env[68492]: INFO nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Terminating instance [ 1681.248231] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.248352] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1681.248538] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2803deec-dd49-4332-a871-3d9e7b140c27 {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.250644] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1681.250852] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1681.251569] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f6d5ba9-9f1d-49eb-8f04-5e043d610d68 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.258176] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1681.258484] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3581df0-93f2-4806-8a0c-b1f8d7814d3e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.260559] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1681.260756] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1681.261727] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc81120a-5122-41e9-814b-090814f1984a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.266193] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for the task: (returnval){ [ 1681.266193] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52046992-48e6-65c2-36a8-babcb6f605a6" [ 1681.266193] env[68492]: _type = "Task" [ 1681.266193] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.276132] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52046992-48e6-65c2-36a8-babcb6f605a6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.329452] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1681.329655] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1681.329814] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleting the datastore file [datastore2] e1c7c4bb-fb65-450c-8c28-11ccf986fe94 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1681.330115] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-35ba97e9-0d0c-47b7-8aea-1e37cfdea0ad {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.335766] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for the task: (returnval){ [ 1681.335766] env[68492]: value = "task-3395537" [ 1681.335766] env[68492]: _type = "Task" [ 1681.335766] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.342993] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395537, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.776321] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1681.776599] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Creating directory with path [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1681.776831] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4679dd7-6cae-490a-a76e-43efe7180d0d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.788509] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Created directory with path [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1681.790036] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Fetch image to [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1681.790036] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1681.790036] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db099fac-58d0-4973-b3c5-0dc5f88d82a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.796133] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4274b775-ce94-4db9-af9e-9ff045903c69 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.804986] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe0074ff-63fa-4702-9327-d41957a189ef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.835943] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac1bd21d-61c2-48bc-b3f6-d724b5abaa7b {{(pid=68492) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.847059] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aea9f484-1e60-4dec-9cac-d2f447bed54e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.848759] env[68492]: DEBUG oslo_vmware.api [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Task: {'id': task-3395537, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06709} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1681.849009] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1681.849202] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1681.849419] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1681.849620] env[68492]: INFO nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Took 0.60 seconds to destroy the instance on the hypervisor. 
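The failure above follows a recognizable pattern: the HTTP write handle for tmp-sparse.vmdk was closed without a response (RemoteDisconnected), and the subsequent CopyVirtualDisk_Task was rejected with "A specified parameter was not correct: fileType" — consistent with a truncated upload leaving a disk descriptor that vCenter cannot parse. A minimal sketch of the same copy-and-wait pattern that nova's vm_util.copy_virtual_disk drives (endpoint, credentials and datastore paths are placeholders, not values from this log):

    from oslo_vmware import api, exceptions

    # Placeholder vCenter endpoint and credentials.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/.../tmp-sparse.vmdk',
        destName='[datastore2] vmware_temp/.../image.vmdk')
    try:
        # Polls the server-side task, like the task-3395535 wait above.
        session.wait_for_task(task)
    except exceptions.VimFaultException as e:
        # A truncated/invalid source descriptor surfaces here as
        # InvalidArgument on fileType, matching the traceback in this log.
        print(e.fault_list)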
[ 1681.851915] env[68492]: DEBUG nova.compute.claims [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1681.852096] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1681.852323] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1681.870893] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1681.990386] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1682.048883] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1682.049116] env[68492]: DEBUG oslo_vmware.rw_handles [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1682.106495] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae02fe6-c1aa-4aa5-9338-d90c5338e8bb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.114075] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f65d2e-fef4-420c-8f23-4c34a1bd37c4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.144549] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f18e0ee0-d21a-4e27-8a4b-03328e08a41f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.151788] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69097758-2a8b-4e57-952c-c8c3444bb6f1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.164728] env[68492]: DEBUG nova.compute.provider_tree [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1682.173661] env[68492]: DEBUG nova.scheduler.client.report [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1682.190060] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.337s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.190175] env[68492]: ERROR nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1682.190175] env[68492]: Faults: ['InvalidArgument'] [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Traceback (most recent call last): [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1682.190175] env[68492]: 
ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self.driver.spawn(context, instance, image_meta, [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self._fetch_image_if_missing(context, vi) [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] image_cache(vi, tmp_image_ds_loc) [ 1682.190175] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] vm_util.copy_virtual_disk( [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] session._wait_for_task(vmdk_copy_task) [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return self.wait_for_task(task_ref) [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return evt.wait() [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] result = hub.switch() [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] return self.greenlet.switch() [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1682.190499] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] self.f(*self.args, **self.kw) [ 1682.190858] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1682.190858] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] raise exceptions.translate_fault(task_info.error) [ 1682.190858] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1682.190858] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Faults: ['InvalidArgument'] [ 1682.190858] env[68492]: ERROR nova.compute.manager [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] [ 1682.190858] env[68492]: DEBUG nova.compute.utils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1682.192169] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Build of instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 was re-scheduled: A specified parameter was not correct: fileType [ 1682.192169] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1682.192530] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1682.192700] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1682.192877] env[68492]: DEBUG nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1682.193049] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1682.494895] env[68492]: DEBUG nova.network.neutron [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1682.506326] env[68492]: INFO nova.compute.manager [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Took 0.31 seconds to deallocate network for instance. [ 1682.605248] env[68492]: INFO nova.scheduler.client.report [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Deleted allocations for instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 [ 1682.625284] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bae56af2-a7f7-4214-aaa4-a202c83fd978 tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 626.251s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.626397] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 430.220s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.626612] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Acquiring lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.627258] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.627258] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.629038] env[68492]: INFO nova.compute.manager [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Terminating instance [ 1682.631131] env[68492]: DEBUG nova.compute.manager [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1682.631422] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1682.631801] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d0c1f544-4fdc-46fd-8784-57e880c98a52 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.637327] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1682.643581] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6620e35f-0524-4e73-b864-fe67a21f4dbf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.672508] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e1c7c4bb-fb65-450c-8c28-11ccf986fe94 could not be found. [ 1682.672714] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1682.672889] env[68492]: INFO nova.compute.manager [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Took 0.04 seconds to destroy the instance on the hypervisor. 
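The lock bookkeeping above ("Acquiring lock" / "acquired ... waited" / "released ... held") is oslo.concurrency's standard tracing: here the terminate path waited 430.220s on the per-instance lock because the failed build path held it for 626.251s. A minimal sketch of the pattern, assuming an illustrative critical section rather than Nova's actual code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('e1c7c4bb-fb65-450c-8c28-11ccf986fe94')
    def terminate():
        # All callers synchronized on the same lock name serialize here;
        # lockutils logs the acquire/wait/held timings seen in this log.
        pass

    terminate()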
[ 1682.673151] env[68492]: DEBUG oslo.service.loopingcall [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1682.675328] env[68492]: DEBUG nova.compute.manager [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1682.675454] env[68492]: DEBUG nova.network.neutron [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1682.691045] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.691298] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.692744] env[68492]: INFO nova.compute.claims [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1682.716480] env[68492]: DEBUG nova.network.neutron [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1682.727852] env[68492]: INFO nova.compute.manager [-] [instance: e1c7c4bb-fb65-450c-8c28-11ccf986fe94] Took 0.05 seconds to deallocate network for instance. 
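The inventory dict that the report client logs above (and again below) maps directly onto Placement's capacity model: usable capacity per resource class is (total - reserved) * allocation_ratio. A short worked check with the numbers from this log, which also shows how 48 physical vCPUs comfortably back the 10 vCPUs the resource tracker reported as allocated:

    # Capacity per resource class: (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # -> VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0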
[ 1682.825184] env[68492]: DEBUG oslo_concurrency.lockutils [None req-8195e554-f11b-48fc-91b4-da1ca665cafe tempest-SecurityGroupsTestJSON-1867999903 tempest-SecurityGroupsTestJSON-1867999903-project-member] Lock "e1c7c4bb-fb65-450c-8c28-11ccf986fe94" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.927658] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ee41d30-f254-453d-a69c-17d25d9622c3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.935187] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-babedcd6-cd9c-4eef-9cf8-492b72663b0a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.965669] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f18c39ef-f482-4325-ba40-259e8c3804a3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.973096] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23262d09-44cc-4687-885c-21b16c659c6d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.986164] env[68492]: DEBUG nova.compute.provider_tree [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1682.994779] env[68492]: DEBUG nova.scheduler.client.report [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1683.010359] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1683.010828] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1683.040629] env[68492]: DEBUG nova.compute.utils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1683.042043] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1683.042138] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1683.050075] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1683.096796] env[68492]: DEBUG nova.policy [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7bf86f7359545ebbf45a5a002c88e5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839d10b6a7894af08ca3717477bcd473', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1683.110253] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1683.134081] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1683.134331] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1683.134556] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1683.134745] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1683.134893] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1683.135050] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1683.135269] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1683.135433] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1683.135596] env[68492]: DEBUG nova.virt.hardware [None
req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1683.135757] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1683.135926] env[68492]: DEBUG nova.virt.hardware [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1683.136783] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c3b2d54-78d7-4c80-ae93-cffad42979a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.144754] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85f1d485-d128-44d0-805a-d605de0e9cd7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.501091] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Successfully created port: 7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1684.276138] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Successfully updated port: 7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1684.289121] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1684.289268] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1684.289476] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1684.322385] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 
tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1684.555774] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Updating instance_info_cache with network_info: [{"id": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "address": "fa:16:3e:21:b1:3b", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7031d064-03", "ovs_interfaceid": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1684.569232] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1684.569573] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance network_info: |[{"id": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "address": "fa:16:3e:21:b1:3b", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7031d064-03", "ovs_interfaceid": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1684.569973] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:21:b1:3b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '310b8ba9-edca-4135-863e-f4a786dd4a77', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7031d064-0372-4f2d-bc3e-47903b00cb6e', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1684.577687] env[68492]: DEBUG oslo.service.loopingcall [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1684.578218] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1684.578523] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-47e8e4cb-a23e-4ce5-99ad-5dc3757f6400 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.600128] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1684.600128] env[68492]: value = "task-3395538" [ 1684.600128] env[68492]: _type = "Task" [ 1684.600128] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1684.610644] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395538, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1684.627871] env[68492]: DEBUG nova.compute.manager [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Received event network-vif-plugged-7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1684.627871] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Acquiring lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1684.627871] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1684.627871] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1684.628044] env[68492]: DEBUG nova.compute.manager [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] No waiting events found dispatching network-vif-plugged-7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1684.628164] env[68492]: WARNING nova.compute.manager [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Received unexpected event network-vif-plugged-7031d064-0372-4f2d-bc3e-47903b00cb6e for instance with vm_state building and task_state spawning. [ 1684.628359] env[68492]: DEBUG nova.compute.manager [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Received event network-changed-7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1684.628527] env[68492]: DEBUG nova.compute.manager [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Refreshing instance network info cache due to event network-changed-7031d064-0372-4f2d-bc3e-47903b00cb6e.
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1684.628705] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Acquiring lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1684.628838] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Acquired lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1684.628990] env[68492]: DEBUG nova.network.neutron [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Refreshing network info cache for port 7031d064-0372-4f2d-bc3e-47903b00cb6e {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1684.870368] env[68492]: DEBUG nova.network.neutron [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Updated VIF entry in instance network info cache for port 7031d064-0372-4f2d-bc3e-47903b00cb6e. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1684.870726] env[68492]: DEBUG nova.network.neutron [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Updating instance_info_cache with network_info: [{"id": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "address": "fa:16:3e:21:b1:3b", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7031d064-03", "ovs_interfaceid": "7031d064-0372-4f2d-bc3e-47903b00cb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1684.879933] env[68492]: DEBUG oslo_concurrency.lockutils [req-1945eb66-14f5-48b0-8bca-fc916b1ec6d6 req-7d49b651-d524-41be-9390-920f568740a4 service nova] Releasing lock "refresh_cache-aab8759d-db1e-4817-98bf-e1fb45e75640" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1685.110727] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395538, 'name': CreateVM_Task, 'duration_secs': 0.296747} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1685.110900] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1685.111543] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1685.111711] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1685.112042] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1685.112300] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec9f5dc6-5102-44ac-9165-857d183afbbe {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.116738] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1685.116738] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528b10dc-3b0b-5a20-036c-a0ef9ed3a90e" [ 1685.116738] env[68492]: _type = "Task" [ 1685.116738] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1685.124757] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528b10dc-3b0b-5a20-036c-a0ef9ed3a90e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1685.628027] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1685.628027] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1685.628475] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1724.231886] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1726.231590] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1727.228046] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1727.249976] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1727.250350] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1727.250350] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1727.269534] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.269708] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.269847] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.269971] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270106] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270225] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270339] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270669] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270669] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270669] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1727.270850] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1729.232104] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1729.244283] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1729.244509] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1729.244674] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1729.244830] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1729.245942] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85d750d0-8bb2-4d50-aacc-f11ba6bc0c43 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.254422] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c291f3c6-bbad-4302-96a0-ab0de8098587 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.269244] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bf59868-7bb5-4c97-a18d-9abac1a37d3c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.275392] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8050629-4060-44fc-90c0-efa28ed756a1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.304092] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1729.304092] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1729.304297] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1729.454829] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455011] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455157] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455327] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455573] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455704] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.455825] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.456256] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.456384] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.456547] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1729.473739] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1729.486058] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1729.496706] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1729.496943] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1729.497115] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1729.516589] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1729.534046] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1729.534298] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1729.545465] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1729.565964] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1729.734053] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3839eca6-aaf0-418b-a0e0-2b931b50c4b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.740567] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-639f7276-c957-44f2-a343-0938af9daf3e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.777612] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42cf181c-d53f-44ce-a1b0-74ca7f38a13b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.785419] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e70f87d-f989-4920-8842-b42f06a954cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1729.798696] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1729.806816] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1729.823541] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1729.823721] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.519s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1730.735187] env[68492]: WARNING oslo_vmware.rw_handles [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1730.735187] 
env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1730.735187] env[68492]: ERROR oslo_vmware.rw_handles [ 1730.735896] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1730.737773] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1730.738059] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Copying Virtual Disk [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/63900642-b4a8-4207-9ddd-19c633f934f7/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1730.738419] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9ff9dc81-5e59-40d8-898f-8f04cf103f6b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1730.746933] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for the task: (returnval){ [ 1730.746933] env[68492]: value = "task-3395539" [ 1730.746933] env[68492]: _type = "Task" [ 1730.746933] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1730.754767] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Task: {'id': task-3395539, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1730.822435] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1730.822658] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1730.822808] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1731.261028] env[68492]: DEBUG oslo_vmware.exceptions [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1731.261028] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1731.261028] env[68492]: ERROR nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1731.261028] env[68492]: Faults: ['InvalidArgument'] [ 1731.261028] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Traceback (most recent call last): [ 1731.261028] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1731.261028] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] yield resources [ 1731.261028] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1731.261028] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self.driver.spawn(context, instance, image_meta, [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] 
self._fetch_image_if_missing(context, vi) [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] image_cache(vi, tmp_image_ds_loc) [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] vm_util.copy_virtual_disk( [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] session._wait_for_task(vmdk_copy_task) [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return self.wait_for_task(task_ref) [ 1731.261507] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return evt.wait() [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] result = hub.switch() [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return self.greenlet.switch() [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self.f(*self.args, **self.kw) [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] raise exceptions.translate_fault(task_info.error) [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Faults: ['InvalidArgument'] [ 1731.261954] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] [ 1731.262368] env[68492]: INFO nova.compute.manager [None 
req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Terminating instance [ 1731.262368] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1731.262368] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1731.263137] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1731.263482] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1731.263833] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf009218-b548-4093-b37c-c6727bb5df19 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.266374] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf50265-c222-4be4-99c7-b95910719f9e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.275027] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1731.275027] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5a4ef40c-7e94-449b-8d1e-8fdae3b1a118 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.278081] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1731.278845] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1731.279610] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b6e9810-34de-43ba-bba2-37428fccc2c4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.284570] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for the task: (returnval){ [ 1731.284570] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52dcb31e-ff3a-18d4-0aaf-b354e050bb2a" [ 1731.284570] env[68492]: _type = "Task" [ 1731.284570] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1731.293137] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52dcb31e-ff3a-18d4-0aaf-b354e050bb2a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1731.346148] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1731.346413] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1731.346680] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Deleting the datastore file [datastore2] 29397c54-4bb2-4b43-afcb-9969d8dec996 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1731.346993] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dbd5f4a7-93e1-47c6-b3c1-d48adcedb9ce {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.354739] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for the task: (returnval){ [ 1731.354739] env[68492]: value = "task-3395541" [ 1731.354739] env[68492]: _type = "Task" [ 1731.354739] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1731.363018] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Task: {'id': task-3395541, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1731.795068] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1731.795522] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Creating directory with path [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1731.795576] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c29e5d33-9d4d-4464-a703-a21aab11838c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.807331] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Created directory with path [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1731.807511] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Fetch image to [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1731.807688] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1731.808454] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fd0faf6-6599-4e14-86af-f566d7b816a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.814886] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c4b2dcf-62b8-4c59-9e63-2a0c49324e67 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.823713] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ded188c-3d50-4de3-bdec-bdf04ea6eef8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.854890] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9d428c9-33c3-484a-834e-638bb87a3fae {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.865919] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c5ac7d08-aac9-4f22-bc94-e367ab55147e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1731.867637] env[68492]: DEBUG oslo_vmware.api [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Task: {'id': task-3395541, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.122571} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1731.867872] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1731.868065] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1731.868243] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1731.868409] env[68492]: INFO nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Took 0.60 seconds to destroy the instance on the hypervisor. 
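The traceback above is worth dwelling on: the InvalidArgument fault is not raised where CopyVirtualDisk_Task is invoked, but inside the task-polling loop (loopingcall -> _poll_task -> translate_fault), and is then re-raised in the caller blocked in wait_for_task via the eventlet event switch. Below is a minimal, hypothetical sketch of that polling pattern; VimFaultException here is a stand-in class and the poll callable is invented for illustration, so this is not the oslo.vmware API itself.

import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(poll, interval=0.5):
    """Poll `poll()` until the task leaves its running state.

    The real driver runs the poll in a looping call on a green thread
    and wakes the caller through an eventlet Event, but the control
    flow is the same: a server-side task error is translated into an
    exception and raised in the waiting caller's context.
    """
    while True:
        state, error, progress = poll()
        if state == 'success':
            return progress
        if state == 'error':
            # Mirrors `raise exceptions.translate_fault(task_info.error)`.
            raise VimFaultException(['InvalidArgument'], error)
        time.sleep(interval)

# A disk-copy task that the server rejects on its second poll:
attempts = iter([('running', None, 0),
                 ('error', 'A specified parameter was not correct: fileType', 0)])
try:
    wait_for_task(lambda: next(attempts), interval=0)
except VimFaultException as exc:
    print(exc, exc.fault_list)   # -> ... fileType ['InvalidArgument']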
[ 1731.870662] env[68492]: DEBUG nova.compute.claims [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1731.870854] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1731.871089] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1731.893054] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1732.058557] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1732.124615] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1732.124615] env[68492]: DEBUG oslo_vmware.rw_handles [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1732.145968] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22963e2b-5dba-4e18-b78e-02ef0cb08ec5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.154049] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-196ae972-0eab-41ee-9ccb-7780f4ecdb2f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.185770] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1408ae1-9ebd-4f69-86e9-b1eae0b87eef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.193334] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b601459d-3ce1-4ec6-9de6-3e842b0c68d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.207294] env[68492]: DEBUG nova.compute.provider_tree [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1732.215715] env[68492]: DEBUG nova.scheduler.client.report [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1732.225778] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.229587] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.358s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1732.230165] env[68492]: ERROR nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1732.230165] env[68492]: Faults: ['InvalidArgument'] [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 
29397c54-4bb2-4b43-afcb-9969d8dec996] Traceback (most recent call last): [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self.driver.spawn(context, instance, image_meta, [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self._fetch_image_if_missing(context, vi) [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] image_cache(vi, tmp_image_ds_loc) [ 1732.230165] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] vm_util.copy_virtual_disk( [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] session._wait_for_task(vmdk_copy_task) [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return self.wait_for_task(task_ref) [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return evt.wait() [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] result = hub.switch() [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] return self.greenlet.switch() [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1732.230871] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] self.f(*self.args, **self.kw) [ 1732.231546] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1732.231546] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] raise exceptions.translate_fault(task_info.error) [ 1732.231546] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1732.231546] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Faults: ['InvalidArgument'] [ 1732.231546] env[68492]: ERROR nova.compute.manager [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] [ 1732.231546] env[68492]: DEBUG nova.compute.utils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1732.232382] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.232677] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Build of instance 29397c54-4bb2-4b43-afcb-9969d8dec996 was re-scheduled: A specified parameter was not correct: fileType [ 1732.232677] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1732.233085] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1732.233285] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1732.233458] env[68492]: DEBUG nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1732.233621] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1732.236296] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.736075] env[68492]: DEBUG nova.network.neutron [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1732.749794] env[68492]: INFO nova.compute.manager [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Took 0.52 seconds to deallocate network for instance. 
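The "compute_resources" acquire/release pairs above (waited 0.000s, held 0.358s) show how the resource tracker serializes claims and aborts on a single process-local lock, with oslo.concurrency's lockutils logging how long each caller waited for and held it. A minimal sketch of that locking pattern, assuming oslo.concurrency is installed; the function bodies are hypothetical:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def abort_instance_claim(instance_uuid):
    # Hypothetical body: return the instance's CPU/RAM/disk to the
    # tracker so the next claim sees the freed capacity.
    print('releasing resources for', instance_uuid)

def instance_claim(instance_uuid):
    # The decorator above is equivalent to this context-manager form.
    with lockutils.lock('compute_resources'):
        print('claiming resources for', instance_uuid)

abort_instance_claim('29397c54-4bb2-4b43-afcb-9969d8dec996')
instance_claim('e6c9ab71-8507-4238-9936-fd9a61101313')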
[ 1732.860454] env[68492]: INFO nova.scheduler.client.report [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Deleted allocations for instance 29397c54-4bb2-4b43-afcb-9969d8dec996 [ 1732.883021] env[68492]: DEBUG oslo_concurrency.lockutils [None req-92113df8-7b4e-452a-89a2-efb8e5710bb2 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 677.070s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1732.883197] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 480.808s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1732.883427] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Acquiring lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1732.883629] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1732.883788] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1732.886142] env[68492]: INFO nova.compute.manager [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Terminating instance [ 1732.887903] env[68492]: DEBUG nova.compute.manager [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1732.888110] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1732.888827] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7ac28dff-cef3-4e30-ad35-2eb10e05f854 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.896428] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1732.902261] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f602b355-0cc2-4f98-93ec-62d53b2c2c71 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1732.931955] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 29397c54-4bb2-4b43-afcb-9969d8dec996 could not be found. [ 1732.932177] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1732.932354] env[68492]: INFO nova.compute.manager [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1732.932601] env[68492]: DEBUG oslo.service.loopingcall [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1732.932820] env[68492]: DEBUG nova.compute.manager [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1732.932915] env[68492]: DEBUG nova.network.neutron [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1732.953861] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1732.954125] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1732.955573] env[68492]: INFO nova.compute.claims [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1732.958961] env[68492]: DEBUG nova.network.neutron [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1732.966874] env[68492]: INFO nova.compute.manager [-] [instance: 29397c54-4bb2-4b43-afcb-9969d8dec996] Took 0.03 seconds to deallocate network for instance. 
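Note how the second terminate above finishes in 0.04 seconds even though the earlier teardown already unregistered the VM: the destroy path treats a missing backend VM as already destroyed (the InstanceNotFound warning) and still runs the network cleanup. A simplified, hypothetical sketch of that idempotent-teardown pattern (the backend object and exception class are stand-ins, not Nova's API):

import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy(instance_uuid, backend):
    try:
        backend.unregister_vm(instance_uuid)
        backend.delete_datastore_files(instance_uuid)
    except InstanceNotFound:
        # Mirrors "Instance does not exist on backend": a VM that is
        # already gone counts as destroyed, not as a failed terminate.
        LOG.warning('Instance does not exist on backend: %s', instance_uuid)
    # Cleanup that must happen regardless of backend state.
    LOG.info('Deallocating network for instance %s', instance_uuid)

class GoneBackend:
    """Backend whose VM was already unregistered by an earlier destroy."""
    def unregister_vm(self, uuid):
        raise InstanceNotFound(uuid)
    def delete_datastore_files(self, uuid):
        pass

destroy('29397c54-4bb2-4b43-afcb-9969d8dec996', GoneBackend())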
[ 1733.073446] env[68492]: DEBUG oslo_concurrency.lockutils [None req-2d24caa8-2918-4e91-8d27-69eec4fd38e6 tempest-ServersTestManualDisk-684726871 tempest-ServersTestManualDisk-684726871-project-member] Lock "29397c54-4bb2-4b43-afcb-9969d8dec996" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1733.177299] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8a788ea-aacf-432e-8c5a-7f1c8d0df0e9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.184984] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef384b0-d0d3-4778-bb86-6fda4406faaa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.214581] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d743a5c3-8694-4e83-9cb0-5207789959d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.222177] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b313dcc-4876-4214-98b4-f518317d0b66 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.235167] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1733.236807] env[68492]: DEBUG nova.compute.provider_tree [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1733.247018] env[68492]: DEBUG nova.scheduler.client.report [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1733.261813] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1733.262478] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1733.299826] env[68492]: DEBUG nova.compute.utils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1733.301140] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1733.301303] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1733.310570] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1733.374236] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1733.401856] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1733.402129] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1733.402287] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1733.402466] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1733.402608] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1733.402757] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1733.402953] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1733.403163] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1733.403337] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1733.403498] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1733.403666] env[68492]: DEBUG nova.virt.hardware [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1733.404537] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38181287-e410-40a6-af9c-4f001da3f375 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.412999] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-029c3be3-f3a0-4aa1-acce-b8b4b7179897 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.433173] env[68492]: DEBUG nova.policy [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '568ab24cbb7d4833bb8cdfd51db89db5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80fa34aee50b4509a18abca39075924a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1733.912795] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Successfully created port: dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1734.239036] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.239215] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 1734.255198] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] There are 1 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 1734.255371] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 40087617-1982-4727-ac78-1cb6437b11c9] Instance has had 0 of 5 cleanup attempts {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11217}} [ 1734.462764] 
env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Successfully updated port: dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1734.476222] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1734.476751] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1734.476751] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1734.567213] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1735.114641] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Updating instance_info_cache with network_info: [{"id": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "address": "fa:16:3e:5d:4f:7f", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc69ac31-55", "ovs_interfaceid": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1735.126468] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock 
"refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1735.126801] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance network_info: |[{"id": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "address": "fa:16:3e:5d:4f:7f", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc69ac31-55", "ovs_interfaceid": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1735.127210] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5d:4f:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35342bcb-8b06-472e-b3c0-43fd3d6c4b30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dc69ac31-55ee-438f-a8a5-8e3092ed05f9', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1735.134873] env[68492]: DEBUG oslo.service.loopingcall [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1735.135372] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1735.135606] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-14c454ec-4c9b-4a28-b43c-1b3fceda228c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1735.155785] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1735.155785] env[68492]: value = "task-3395542" [ 1735.155785] env[68492]: _type = "Task" [ 1735.155785] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1735.163049] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395542, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1735.230785] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1735.230993] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 1735.305743] env[68492]: DEBUG nova.compute.manager [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Received event network-vif-plugged-dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1735.306095] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Acquiring lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1735.306392] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1735.306659] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1735.306903] env[68492]: DEBUG nova.compute.manager [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] No waiting events found dispatching network-vif-plugged-dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1735.307306] env[68492]: WARNING nova.compute.manager [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Received unexpected event network-vif-plugged-dc69ac31-55ee-438f-a8a5-8e3092ed05f9 for instance with vm_state building and task_state spawning. 
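The three entries above ("Received event", "No waiting events found dispatching", "Received unexpected event") trace the external-event handshake: Neutron reports network-vif-plugged to the compute service, which wakes any thread that registered a waiter for that (instance, event) pair; here the spawn had not registered one yet, so the event is logged as unexpected and dropped. A hypothetical simplification of that dispatch registry (the real one is nova.compute.manager.InstanceEvents, visible in the lock names above, and in this eventlet-based service it uses green-thread events rather than OS threads):

import threading

class InstanceEvents:
    def __init__(self):
        self._waiters = {}   # (instance_uuid, event_name) -> threading.Event
        self._lock = threading.Lock()

    def prepare_for_event(self, instance_uuid, event_name):
        # Called by the spawning thread before it blocks in wait().
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        # Called on delivery; returns None when nobody is waiting.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

events = InstanceEvents()
waiter = events.pop_instance_event(
    'e6c9ab71-8507-4238-9936-fd9a61101313',
    'network-vif-plugged-dc69ac31-55ee-438f-a8a5-8e3092ed05f9')
if waiter is None:
    print('No waiting events found; logging event as unexpected')
else:
    waiter.set()   # wakes the spawning thread blocked on waiter.wait()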
[ 1735.307553] env[68492]: DEBUG nova.compute.manager [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Received event network-changed-dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1735.307789] env[68492]: DEBUG nova.compute.manager [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Refreshing instance network info cache due to event network-changed-dc69ac31-55ee-438f-a8a5-8e3092ed05f9. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1735.308042] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Acquiring lock "refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1735.308241] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Acquired lock "refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1735.308454] env[68492]: DEBUG nova.network.neutron [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Refreshing network info cache for port dc69ac31-55ee-438f-a8a5-8e3092ed05f9 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1735.574271] env[68492]: DEBUG nova.network.neutron [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Updated VIF entry in instance network info cache for port dc69ac31-55ee-438f-a8a5-8e3092ed05f9. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1735.574618] env[68492]: DEBUG nova.network.neutron [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Updating instance_info_cache with network_info: [{"id": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "address": "fa:16:3e:5d:4f:7f", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc69ac31-55", "ovs_interfaceid": "dc69ac31-55ee-438f-a8a5-8e3092ed05f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1735.583557] env[68492]: DEBUG oslo_concurrency.lockutils [req-fa76413f-eec4-4034-bf79-a47a74503561 req-f42793fd-a05d-4c14-ab3e-006960641796 service nova] Releasing lock "refresh_cache-e6c9ab71-8507-4238-9936-fd9a61101313" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1735.665507] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395542, 'name': CreateVM_Task, 'duration_secs': 0.406455} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1735.665680] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1735.666378] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1735.666569] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1735.666898] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1735.667165] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cf976f6-28b0-4ec2-8f5b-764f75d3a02e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1735.671873] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 1735.671873] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]526e8ef4-43ba-5f2c-b39e-6358d9e9a54f" [ 1735.671873] env[68492]: _type = "Task" [ 1735.671873] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1735.680721] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]526e8ef4-43ba-5f2c-b39e-6358d9e9a54f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1736.182953] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1736.182953] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1736.182953] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1768.203650] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1768.225559] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 1768.225559] env[68492]: value = "domain-c8" [ 1768.225559] env[68492]: _type = "ClusterComputeResource" [ 1768.225559] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1768.227231] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01766154-bcfd-4116-882a-9739efc142f7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.243875] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 10 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1768.244058] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 29bd5cc4-d884-4202-b503-74920a0b4ec5 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.244252] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 4a7172f0-050f-4040-b974-91ce9ac96a0d {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.244413] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.244567] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 03afef99-e2dd-4467-8426-fbe50481aa6f {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.244717] env[68492]: DEBUG nova.compute.manager 
[None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid b0757e62-96ca-4758-8444-dcc98fbf0a29 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.244866] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.245034] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 18e27433-5b1f-4ae8-8bfc-a232966de70b {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.245189] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid a90e989d-6aef-482f-b767-8dbdd7f29628 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.245337] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid aab8759d-db1e-4817-98bf-e1fb45e75640 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.245480] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid e6c9ab71-8507-4238-9936-fd9a61101313 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 1768.245798] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.246057] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.246259] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.246453] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "03afef99-e2dd-4467-8426-fbe50481aa6f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.246645] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.246836] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" by
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.247034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.247230] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "a90e989d-6aef-482f-b767-8dbdd7f29628" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.247590] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "aab8759d-db1e-4817-98bf-e1fb45e75640" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.247668] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "e6c9ab71-8507-4238-9936-fd9a61101313" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1781.411367] env[68492]: WARNING oslo_vmware.rw_handles [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1781.411367] env[68492]: ERROR oslo_vmware.rw_handles [ 1781.412263] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to 
vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1781.413742] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1781.413988] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Copying Virtual Disk [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/e370881c-83ed-43f8-a27c-8e033d7ddec5/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1781.414281] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a2391e20-38d1-4d47-b2d7-c7a6f1292261 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.423460] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for the task: (returnval){ [ 1781.423460] env[68492]: value = "task-3395543" [ 1781.423460] env[68492]: _type = "Task" [ 1781.423460] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1781.431343] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Task: {'id': task-3395543, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1781.933906] env[68492]: DEBUG oslo_vmware.exceptions [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1781.934191] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1781.934778] env[68492]: ERROR nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1781.934778] env[68492]: Faults: ['InvalidArgument'] [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Traceback (most recent call last): [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] yield resources [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self.driver.spawn(context, instance, image_meta, [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self._fetch_image_if_missing(context, vi) [ 1781.934778] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] image_cache(vi, tmp_image_ds_loc) [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] vm_util.copy_virtual_disk( [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] session._wait_for_task(vmdk_copy_task) [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return self.wait_for_task(task_ref) [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return evt.wait() [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] result = hub.switch() [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1781.935216] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return self.greenlet.switch() [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self.f(*self.args, **self.kw) [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] raise exceptions.translate_fault(task_info.error) [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Faults: ['InvalidArgument'] [ 1781.935653] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] [ 1781.935653] env[68492]: INFO nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Terminating instance [ 1781.936641] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1781.936847] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1781.937119] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e01d9639-32f8-4833-bddf-0404c93634ea {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.939252] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1781.939438] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1781.940160] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd1a7961-03cc-47a4-8a6a-3b1fbc72bf21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.946799] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1781.947040] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4fbc2853-b5b5-4c04-9e99-f8f4b1c7b641 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.949064] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1781.949237] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1781.950153] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00f9841e-20a7-489a-9b1d-69fca7ed5beb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1781.954758] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1781.954758] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c16ddb-09d0-2523-3005-b36af070698c" [ 1781.954758] env[68492]: _type = "Task" [ 1781.954758] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1781.966295] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c16ddb-09d0-2523-3005-b36af070698c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1782.014696] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1782.014912] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1782.015107] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Deleting the datastore file [datastore2] 29bd5cc4-d884-4202-b503-74920a0b4ec5 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1782.015368] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-664947c5-8df3-4881-bae3-ff3dd835d3e3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.021074] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for the task: (returnval){ [ 1782.021074] env[68492]: value = "task-3395545" [ 1782.021074] env[68492]: _type = "Task" [ 1782.021074] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1782.029570] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Task: {'id': task-3395545, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1782.464986] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1782.465359] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating directory with path [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1782.465516] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-156d0c7f-37ad-4630-bb46-b0961f5febad {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.476018] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created directory with path [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1782.476226] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Fetch image to [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1782.476391] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1782.477084] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8897395-b67a-4188-8c14-99e72c82b375 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.483197] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a575447c-9fb0-4579-908d-b64cdd1c73c8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.491943] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c2e31d8-c47d-407a-86f6-60426b7d907b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.524507] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa0a1d1d-5b3d-4749-a582-d3f83d8d2b09 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.530949] env[68492]: DEBUG oslo_vmware.api [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Task: {'id': task-3395545, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073012} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1782.532367] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1782.532553] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1782.532721] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1782.532888] env[68492]: INFO nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Took 0.59 seconds to destroy the instance on the hypervisor. 
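
The fetch/copy sequence running through these entries is serialized by the per-image lock "[datastore2] devstack-image-cache_base/595bda25-...": each build checks the datastore cache under that lock, downloads to a tmp-sparse.vmdk staging path only on a miss, and publishes the result for later builds. A minimal single-process sketch of that check-then-fetch pattern, using hypothetical helper names (fetch_image_if_missing, download) and plain threading locks in place of Nova's lockutils:

import os
import shutil
import tempfile
import threading

_image_locks = {}
_guard = threading.Lock()

def _lock_for(image_id):
    # one lock per image id, mirroring the per-image lock names in the log
    with _guard:
        return _image_locks.setdefault(image_id, threading.Lock())

def fetch_image_if_missing(image_id, cache_dir, download):
    cached = os.path.join(cache_dir, image_id, image_id + '.vmdk')
    with _lock_for(image_id):
        if os.path.exists(cached):
            return cached                  # already cached by an earlier request
        os.makedirs(os.path.dirname(cached), exist_ok=True)
        tmp = cached + '.tmp-sparse'       # staging file, like tmp-sparse.vmdk
        download(tmp)                      # e.g. the HTTP write seen in the log
        shutil.move(tmp, cached)           # publish while still holding the lock
        return cached

# Stand-in download that just creates the staging file:
with tempfile.TemporaryDirectory() as d:
    print(fetch_image_if_missing('595bda25-3485-4d7e-9f66-50f61186cadc', d,
                                 lambda tmp: open(tmp, 'w').close()))
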
[ 1782.534849] env[68492]: DEBUG nova.compute.claims [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1782.535025] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1782.535254] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1782.537598] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea4b9ea3-cd70-4799-9547-417139373234 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.563235] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1782.613882] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1782.674168] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1782.674362] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1782.782457] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1050ff47-f170-442d-8f20-0aa2f2770b53 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.790180] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba732f16-ec30-4bb5-8079-fbafa74775ac {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.820508] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c7bf937-56cb-456b-89d6-9cad31dca594 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.828904] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b033aa6-de6a-44d9-9017-39470eb13b3f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1782.841215] env[68492]: DEBUG nova.compute.provider_tree [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1782.849832] env[68492]: DEBUG nova.scheduler.client.report [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1782.863052] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1782.863557] env[68492]: ERROR nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1782.863557] env[68492]: Faults: ['InvalidArgument'] [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Traceback (most recent call last): [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 
29bd5cc4-d884-4202-b503-74920a0b4ec5] self.driver.spawn(context, instance, image_meta, [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self._fetch_image_if_missing(context, vi) [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] image_cache(vi, tmp_image_ds_loc) [ 1782.863557] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] vm_util.copy_virtual_disk( [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] session._wait_for_task(vmdk_copy_task) [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return self.wait_for_task(task_ref) [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return evt.wait() [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] result = hub.switch() [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] return self.greenlet.switch() [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1782.863976] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] self.f(*self.args, **self.kw) [ 1782.864597] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1782.864597] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] raise exceptions.translate_fault(task_info.error) [ 1782.864597] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1782.864597] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Faults: ['InvalidArgument'] [ 1782.864597] env[68492]: ERROR nova.compute.manager [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] [ 1782.864597] env[68492]: DEBUG nova.compute.utils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1782.865637] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Build of instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 was re-scheduled: A specified parameter was not correct: fileType [ 1782.865637] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1782.866020] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1782.866194] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1782.866424] env[68492]: DEBUG nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1782.866581] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1782.936887] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "aab8759d-db1e-4817-98bf-e1fb45e75640" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.294375] env[68492]: DEBUG nova.network.neutron [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1783.304729] env[68492]: INFO nova.compute.manager [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Took 0.44 seconds to deallocate network for instance.
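
The InvalidArgument tracebacks above end in `raise exceptions.translate_fault(task_info.error)`: the vCenter task error is turned into a Python exception that carries the fault list, and _do_build_and_run_instance then treats it as a reschedulable build failure. A rough sketch of that translation step, with assumed error-dict keys; the real mapping lives in oslo_vmware.exceptions and is richer than this:

class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def translate_fault(error):
    # 'faults' / 'localizedMessage' are assumed field names for illustration
    return VimFaultException(error.get('faults', []),
                             error.get('localizedMessage', 'unknown vim fault'))

exc = translate_fault({'localizedMessage':
                       'A specified parameter was not correct: fileType',
                       'faults': ['InvalidArgument']})
print(exc, exc.fault_list)   # mirrors the "Faults: ['InvalidArgument']" lines
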
[ 1783.398712] env[68492]: INFO nova.scheduler.client.report [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Deleted allocations for instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 [ 1783.417527] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f58e6e60-c47b-4069-b8e5-dd6d394af902 tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 678.943s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.418634] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 483.169s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1783.418853] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Acquiring lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.419094] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1783.419271] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.422139] env[68492]: INFO nova.compute.manager [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Terminating instance [ 1783.422814] env[68492]: DEBUG nova.compute.manager [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Start destroying the instance on the hypervisor.
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1783.423013] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1783.423468] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ef9a8ec3-101d-4b3a-9f6e-6b8d5c77debf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.433762] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05f10618-565a-49ef-9ea0-3663fed2e615 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.445636] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1783.466243] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 29bd5cc4-d884-4202-b503-74920a0b4ec5 could not be found. [ 1783.466493] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1783.466630] env[68492]: INFO nova.compute.manager [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1783.466888] env[68492]: DEBUG oslo.service.loopingcall [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1783.467169] env[68492]: DEBUG nova.compute.manager [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1783.467274] env[68492]: DEBUG nova.network.neutron [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1783.490424] env[68492]: DEBUG nova.network.neutron [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1783.492876] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1783.493114] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1783.494486] env[68492]: INFO nova.compute.claims [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1783.500894] env[68492]: INFO nova.compute.manager [-] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] Took 0.03 seconds to deallocate network for instance. [ 1783.586667] env[68492]: DEBUG oslo_concurrency.lockutils [None req-46255831-feea-4463-bcff-9c89b82b6beb tempest-ServerTagsTestJSON-615346621 tempest-ServerTagsTestJSON-615346621-project-member] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.588059] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 15.342s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1783.588193] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 29bd5cc4-d884-4202-b503-74920a0b4ec5] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1783.588387] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "29bd5cc4-d884-4202-b503-74920a0b4ec5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.669790] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f579650c-e7a3-4bb2-bc24-05c4d99f3a22 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.677575] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-814a927c-efa9-480e-a668-0609834bfcd9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.706092] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b5c4431-b0e8-4115-a1d0-41a5837009ed {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.712520] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56e25288-f413-4e42-ae3b-f8eb079a6424 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.725889] env[68492]: DEBUG nova.compute.provider_tree [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1783.735492] env[68492]: DEBUG nova.scheduler.client.report [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1783.747537] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1783.747944] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1783.784875] env[68492]: DEBUG nova.compute.utils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1783.786556] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1783.786749] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1783.795040] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1783.853335] env[68492]: DEBUG nova.policy [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0632cf422c924ac5a572e94eabb9474d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b52fd4757d840e1b1fff8ffc0b5e273', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1783.863633] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1783.888024] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1783.888024] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1783.888252] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1783.888351] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1783.888498] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1783.888643] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1783.888847] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1783.889074] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 
tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1783.889255] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1783.889415] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1783.889792] env[68492]: DEBUG nova.virt.hardware [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1783.890446] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25914058-b2a9-4b81-90c4-aaee0679f8dc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1783.898114] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b29ba42-f56a-460c-ac90-fbf2a9184cca {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1784.275707] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1784.339213] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Successfully created port: 1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1784.900190] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Successfully updated port: 1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1784.912548] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1784.913024] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 
tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquired lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1784.913024] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1784.951812] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1785.235232] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Updating instance_info_cache with network_info: [{"id": "1801c67c-039c-4c44-960c-dc08c455f6f6", "address": "fa:16:3e:87:ed:07", "network": {"id": "c8af3a1d-55cb-4536-9cd5-7d5754fe007f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1465210694-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b52fd4757d840e1b1fff8ffc0b5e273", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13b62154-a0e1-4eed-bc30-6464b15993bb", "external-id": "nsx-vlan-transportzone-514", "segmentation_id": 514, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1801c67c-03", "ovs_interfaceid": "1801c67c-039c-4c44-960c-dc08c455f6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1785.248098] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Releasing lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1785.248398] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance network_info: |[{"id": "1801c67c-039c-4c44-960c-dc08c455f6f6", "address": "fa:16:3e:87:ed:07", "network": {"id": "c8af3a1d-55cb-4536-9cd5-7d5754fe007f", "bridge": "br-int", "label": 
"tempest-ServersNegativeTestMultiTenantJSON-1465210694-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b52fd4757d840e1b1fff8ffc0b5e273", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13b62154-a0e1-4eed-bc30-6464b15993bb", "external-id": "nsx-vlan-transportzone-514", "segmentation_id": 514, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1801c67c-03", "ovs_interfaceid": "1801c67c-039c-4c44-960c-dc08c455f6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1785.248787] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:ed:07', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13b62154-a0e1-4eed-bc30-6464b15993bb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1801c67c-039c-4c44-960c-dc08c455f6f6', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1785.256893] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Creating folder: Project (5b52fd4757d840e1b1fff8ffc0b5e273). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1785.257839] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fec67630-ae6f-4404-9918-9ca8b49ffa55 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.267783] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Created folder: Project (5b52fd4757d840e1b1fff8ffc0b5e273) in parent group-v677434. [ 1785.268089] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Creating folder: Instances. Parent ref: group-v677545. 
{{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1785.268207] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6bf2f9a1-cb25-4767-ac81-c5cfc30ea293 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.276981] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Created folder: Instances in parent group-v677545. [ 1785.277273] env[68492]: DEBUG oslo.service.loopingcall [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1785.277471] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1785.277832] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f24d1ee4-9201-4e58-a56e-42532329bf9f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.298329] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1785.298329] env[68492]: value = "task-3395548" [ 1785.298329] env[68492]: _type = "Task" [ 1785.298329] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1785.305737] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395548, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1785.575544] env[68492]: DEBUG nova.compute.manager [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Received event network-vif-plugged-1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1785.575812] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Acquiring lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1785.578618] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1785.578866] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1785.579120] env[68492]: DEBUG nova.compute.manager [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] No waiting events found dispatching network-vif-plugged-1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1785.579338] env[68492]: WARNING nova.compute.manager [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Received unexpected event network-vif-plugged-1801c67c-039c-4c44-960c-dc08c455f6f6 for instance with vm_state building and task_state spawning. [ 1785.580246] env[68492]: DEBUG nova.compute.manager [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Received event network-changed-1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1785.580246] env[68492]: DEBUG nova.compute.manager [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Refreshing instance network info cache due to event network-changed-1801c67c-039c-4c44-960c-dc08c455f6f6. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1785.580246] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Acquiring lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1785.580246] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Acquired lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1785.580246] env[68492]: DEBUG nova.network.neutron [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Refreshing network info cache for port 1801c67c-039c-4c44-960c-dc08c455f6f6 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1785.809964] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395548, 'name': CreateVM_Task, 'duration_secs': 0.333979} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1785.810181] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1785.810807] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1785.810969] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1785.811312] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1785.811568] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-13c84714-f40e-4914-be37-7d177c32b496 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.815897] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for the task: (returnval){ [ 1785.815897] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e97bf1-e4cb-8aa7-f9ed-19a9a0a65788" [ 1785.815897] env[68492]: 
_type = "Task" [ 1785.815897] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1785.824046] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e97bf1-e4cb-8aa7-f9ed-19a9a0a65788, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1785.847534] env[68492]: DEBUG nova.network.neutron [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Updated VIF entry in instance network info cache for port 1801c67c-039c-4c44-960c-dc08c455f6f6. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1785.847878] env[68492]: DEBUG nova.network.neutron [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Updating instance_info_cache with network_info: [{"id": "1801c67c-039c-4c44-960c-dc08c455f6f6", "address": "fa:16:3e:87:ed:07", "network": {"id": "c8af3a1d-55cb-4536-9cd5-7d5754fe007f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1465210694-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5b52fd4757d840e1b1fff8ffc0b5e273", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13b62154-a0e1-4eed-bc30-6464b15993bb", "external-id": "nsx-vlan-transportzone-514", "segmentation_id": 514, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1801c67c-03", "ovs_interfaceid": "1801c67c-039c-4c44-960c-dc08c455f6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1785.858189] env[68492]: DEBUG oslo_concurrency.lockutils [req-2c829650-bf84-456d-a3e8-31a8dc275cba req-b7cc5159-fb33-43df-9b8c-e4bce7e670cc service nova] Releasing lock "refresh_cache-610e0ba9-49f1-45b7-9dea-08945d1d56b9" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1786.328602] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1786.328931] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 
610e0ba9-49f1-45b7-9dea-08945d1d56b9] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1786.329508] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1787.230594] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1787.232340] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1787.232340] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1787.252648] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.252822] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.252954] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253148] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253317] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253445] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253565] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253682] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253797] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.253913] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1787.254045] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1788.231505] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1788.699728] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "e6c9ab71-8507-4238-9936-fd9a61101313" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1789.230591] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1789.241978] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1789.242309] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1789.242362] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1789.242515] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1789.243608] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc87cbb-06c5-4e23-96be-27572dbdcab6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.252289] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9d3a193-76f8-4833-b036-3e75e7931f43 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.265991] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c02b5831-ec5b-480d-9824-9ef80253686f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.272417] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b13398-1757-413e-9224-ce344bd2f8ef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.302840] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180958MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1789.302996] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1789.303208] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1789.382241] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.382408] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.382538] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.382662] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.382806] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.382977] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.383039] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.383173] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.383329] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.383452] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1789.394512] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1789.394745] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1789.394896] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1789.512870] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1f0e97-7c89-4b00-a311-8eb1e5994dde {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.520370] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de8ad5d1-a6dc-4444-9cb4-292b88ba21a7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.549346] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab1b3ba-8ec0-4324-962b-23a304f79697 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.555776] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6756db99-5bd1-49da-8cd2-327bc6e5d244 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.569246] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1789.577141] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1789.589666] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1789.589834] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1790.591190] env[68492]: DEBUG oslo_service.periodic_task [None 
req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.231451] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.231629] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1792.232025] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1794.226147] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1794.230803] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1826.648656] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1830.771538] env[68492]: WARNING oslo_vmware.rw_handles [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: 
Remote end closed connection without response [ 1830.771538] env[68492]: ERROR oslo_vmware.rw_handles [ 1830.772557] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1830.774213] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1830.774461] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Copying Virtual Disk [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/03232ecc-0ae5-4197-8628-52ced7c139c3/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1830.774768] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f368ba3e-02b3-481a-b14f-eecc157495e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.783162] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1830.783162] env[68492]: value = "task-3395549" [ 1830.783162] env[68492]: _type = "Task" [ 1830.783162] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1830.790844] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395549, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.294245] env[68492]: DEBUG oslo_vmware.exceptions [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1831.294505] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1831.295064] env[68492]: ERROR nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1831.295064] env[68492]: Faults: ['InvalidArgument'] [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Traceback (most recent call last): [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] yield resources [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self.driver.spawn(context, instance, image_meta, [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self._fetch_image_if_missing(context, vi) [ 1831.295064] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] image_cache(vi, tmp_image_ds_loc) [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] vm_util.copy_virtual_disk( [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] session._wait_for_task(vmdk_copy_task) [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return self.wait_for_task(task_ref) [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return evt.wait() [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] result = hub.switch() [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1831.295520] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return self.greenlet.switch() [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self.f(*self.args, **self.kw) [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] raise exceptions.translate_fault(task_info.error) [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Faults: ['InvalidArgument'] [ 1831.295920] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] [ 1831.295920] env[68492]: INFO nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Terminating instance [ 1831.296881] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1831.297093] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1831.297386] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4fb40f0e-5de6-4289-ac19-17ff7c144e6e {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.299541] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1831.299733] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1831.300492] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-960592e1-d5b5-4d18-babf-683ac5814adb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.307208] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1831.307425] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4ea70173-e274-4203-ae67-456af1c5f992 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.309555] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1831.309722] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1831.310675] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f1a6e028-1ab9-4024-ba11-d4db4bbfb365 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.315048] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 1831.315048] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d3a030-e0b8-5e0b-90e8-6369594baf88" [ 1831.315048] env[68492]: _type = "Task" [ 1831.315048] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1831.322469] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d3a030-e0b8-5e0b-90e8-6369594baf88, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.378020] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1831.378254] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1831.378441] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleting the datastore file [datastore2] 4a7172f0-050f-4040-b974-91ce9ac96a0d {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1831.378714] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1fa3f103-85f1-4596-98c8-be3c141ec7cc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.384836] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1831.384836] env[68492]: value = "task-3395551" [ 1831.384836] env[68492]: _type = "Task" [ 1831.384836] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1831.392324] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395551, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1831.825629] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1831.826082] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating directory with path [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1831.826201] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b465e69b-38d6-4188-aa6b-43d3ee0ec455 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.837114] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created directory with path [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1831.837297] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Fetch image to [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1831.837468] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1831.838180] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd7dc0c9-f0b6-41b3-adbd-bfa48d0127e3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.844198] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a21c49c-c502-40b9-81e6-1ec0845e7d75 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.853765] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac6ec60-f9d4-45c5-af5e-3a33f959a43e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.882939] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98e7b332-cfae-45f5-9c11-a22ea5a5e292 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.889608] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-07352f15-9474-4f4d-874c-19679c29bc60 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1831.893648] env[68492]: DEBUG oslo_vmware.api [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395551, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073156} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1831.894183] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1831.894373] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1831.894542] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1831.894771] env[68492]: INFO nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Took 0.59 seconds to destroy the instance on the hypervisor. 
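The records above show the driver-side pattern behind every vCenter operation in this log: invoke a *_Task method (CopyVirtualDisk_Task, DeleteDatastoreFile_Task), then poll the task object until it reports success or error ("progress is 0%." ... "completed successfully", with duration_secs attached). A minimal stand-alone sketch of that polling loop, assuming a get_task_info callable and a TaskFault type as stand-ins for the oslo.vmware internals (not the library's actual API):

```python
import time


class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, msg, faults):
        super().__init__(msg)
        self.faults = faults


def wait_for_task(get_task_info, task_id, poll_interval=0.5):
    """Poll a vCenter-style task until it leaves its running states.

    get_task_info(task_id) is assumed to return a dict such as
    {'state': 'running', 'progress': 0} or
    {'state': 'error', 'error': {'msg': '...', 'faults': ['InvalidArgument']}},
    mirroring the fields the Task lines in the log print.
    """
    while True:
        info = get_task_info(task_id)
        if info["state"] == "success":
            return info  # e.g. DeleteDatastoreFile_Task completed successfully
        if info["state"] == "error":
            err = info["error"]  # translated into a typed fault by the caller,
            raise TaskFault(err["msg"], err["faults"])  # e.g. InvalidArgument
        # Matches the periodic "_poll_task ... progress is N%." debug lines.
        print("Task: %s progress is %d%%." % (task_id, info.get("progress", 0)))
        time.sleep(poll_interval)
```

On error this is where the InvalidArgument fault from the CopyVirtualDisk_Task above surfaces as a VimFaultException and unwinds back through spawn.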
[ 1831.896719] env[68492]: DEBUG nova.compute.claims [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1831.896878] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1831.897106] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1831.912738] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1831.962161] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1832.022302] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1832.022493] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1832.129347] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11858f2f-3845-4234-9e60-00e9a600834b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.136920] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce755a1b-181e-4e37-b750-fe3f593a7dbd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.170098] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9cd8cb2-6954-42ce-97c8-da98dac3c25e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.177139] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e95a580-8ddd-4bc4-a473-9c91c9d08163 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.190036] env[68492]: DEBUG nova.compute.provider_tree [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1832.198010] env[68492]: DEBUG nova.scheduler.client.report [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1832.214060] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.214599] env[68492]: ERROR nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1832.214599] env[68492]: Faults: ['InvalidArgument'] [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Traceback (most recent call last): [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1832.214599] env[68492]: ERROR 
nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self.driver.spawn(context, instance, image_meta, [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self._fetch_image_if_missing(context, vi) [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] image_cache(vi, tmp_image_ds_loc) [ 1832.214599] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] vm_util.copy_virtual_disk( [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] session._wait_for_task(vmdk_copy_task) [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return self.wait_for_task(task_ref) [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return evt.wait() [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] result = hub.switch() [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] return self.greenlet.switch() [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1832.215199] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] self.f(*self.args, **self.kw) [ 1832.215773] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1832.215773] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] raise exceptions.translate_fault(task_info.error) [ 1832.215773] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1832.215773] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Faults: ['InvalidArgument'] [ 1832.215773] env[68492]: ERROR nova.compute.manager [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] [ 1832.215773] env[68492]: DEBUG nova.compute.utils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1832.216616] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Build of instance 4a7172f0-050f-4040-b974-91ce9ac96a0d was re-scheduled: A specified parameter was not correct: fileType [ 1832.216616] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1832.216981] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1832.217382] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1832.217667] env[68492]: DEBUG nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1832.217755] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1832.518629] env[68492]: DEBUG nova.network.neutron [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1832.530334] env[68492]: INFO nova.compute.manager [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Took 0.31 seconds to deallocate network for instance. [ 1832.622458] env[68492]: INFO nova.scheduler.client.report [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleted allocations for instance 4a7172f0-050f-4040-b974-91ce9ac96a0d [ 1832.642853] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d6db8c2d-a3c1-4b9e-b567-e48faba478d2 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 638.796s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.644016] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 442.763s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1832.644268] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1832.644481] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s
{{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1832.645714] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.646560] env[68492]: INFO nova.compute.manager [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Terminating instance [ 1832.648923] env[68492]: DEBUG nova.compute.manager [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1832.649193] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1832.649693] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-99104086-aeea-4aea-96f8-70d8fa1fb6b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.654450] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1832.662026] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aba39df1-919c-4474-8e70-168da4f46596 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.690848] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a7172f0-050f-4040-b974-91ce9ac96a0d could not be found. [ 1832.691066] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1832.691252] env[68492]: INFO nova.compute.manager [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Took 0.04 seconds to destroy the instance on the hypervisor.
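The lock messages above trace Nova's per-instance serialization: _locked_do_build_and_run_instance held the lock named after the instance UUID for 638.796s, so do_terminate_instance reported "waited 442.763s" before it could start tearing the instance down. A minimal sketch of that pattern with oslo.concurrency; the function bodies and names are illustrative, not Nova's actual code:

```python
from oslo_concurrency import lockutils


def locked_do_build_and_run_instance(instance_uuid):
    # Held for the whole build; a long image fetch keeps it held for minutes,
    # which is the 638.796s "held" figure in the log.
    with lockutils.lock(instance_uuid):
        ...  # build steps would run here


def do_terminate_instance(instance_uuid):
    # Blocks until the build above releases the same named lock, which is
    # why the acquire line reports "waited 442.763s".
    with lockutils.lock(instance_uuid):
        ...  # destroy on the hypervisor, then deallocate the network
```

The separate "<uuid>-events" lock seen above serializes only the instance's event bookkeeping, so event dispatch never has to wait behind a long build or delete.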
[ 1832.691495] env[68492]: DEBUG oslo.service.loopingcall [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1832.693738] env[68492]: DEBUG nova.compute.manager [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1832.693842] env[68492]: DEBUG nova.network.neutron [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1832.707932] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1832.708181] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1832.709663] env[68492]: INFO nova.compute.claims [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1832.718586] env[68492]: DEBUG nova.network.neutron [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1832.729252] env[68492]: INFO nova.compute.manager [-] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] Took 0.04 seconds to deallocate network for instance. [ 1832.837280] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c7644058-aee4-4175-b49e-e06967a4e617 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.193s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.838116] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 64.592s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1832.838306] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 4a7172f0-050f-4040-b974-91ce9ac96a0d] During sync_power_state the instance has a pending task (deleting).
Skip. [ 1832.838477] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "4a7172f0-050f-4040-b974-91ce9ac96a0d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.880732] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e31c857-d788-439e-9993-0c9c6e5531cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.888420] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45d648ae-1635-4eb3-8bb8-399995b42ac1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.917969] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f513ae-2efc-4444-bae1-2875538214b1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.924643] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fbd87ad-044f-4ab4-a987-61c3afd7240f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1832.937186] env[68492]: DEBUG nova.compute.provider_tree [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1832.946275] env[68492]: DEBUG nova.scheduler.client.report [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1832.959205] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1832.959700] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Start building networks asynchronously for instance.
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1832.991278] env[68492]: DEBUG nova.compute.utils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1832.992641] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1832.992812] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1833.000119] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1833.058338] env[68492]: DEBUG nova.policy [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06d98ba654414d2091d24b5304834776', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbfde028d2494faca2e128b80c7c6a0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1833.062295] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1833.086556] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1833.086800] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1833.086957] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1833.087171] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1833.087315] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1833.087473] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1833.087673] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1833.087832] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1833.087994] 
env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1833.088165] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1833.088333] env[68492]: DEBUG nova.virt.hardware [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1833.089201] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c0c4142-03b6-47de-93fb-87a398b93031 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1833.098532] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5409808-a481-421c-85fd-874b5019b156 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1833.785204] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Successfully created port: 21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1834.379593] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Successfully updated port: 21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1834.394400] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1834.394557] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1834.394725] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1834.431134] env[68492]: DEBUG 
nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1834.556084] env[68492]: DEBUG nova.compute.manager [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Received event network-vif-plugged-21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1834.556361] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Acquiring lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1834.556505] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1834.556671] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1834.556833] env[68492]: DEBUG nova.compute.manager [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] No waiting events found dispatching network-vif-plugged-21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1834.556992] env[68492]: WARNING nova.compute.manager [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Received unexpected event network-vif-plugged-21318102-60b6-4414-adb4-37b3496c9fad for instance with vm_state building and task_state spawning. [ 1834.562915] env[68492]: DEBUG nova.compute.manager [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Received event network-changed-21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1834.562915] env[68492]: DEBUG nova.compute.manager [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Refreshing instance network info cache due to event network-changed-21318102-60b6-4414-adb4-37b3496c9fad.
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1834.562915] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Acquiring lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1834.694697] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Updating instance_info_cache with network_info: [{"id": "21318102-60b6-4414-adb4-37b3496c9fad", "address": "fa:16:3e:c2:53:a1", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21318102-60", "ovs_interfaceid": "21318102-60b6-4414-adb4-37b3496c9fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1834.708021] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1834.708021] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance network_info: |[{"id": "21318102-60b6-4414-adb4-37b3496c9fad", "address": "fa:16:3e:c2:53:a1", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap21318102-60", "ovs_interfaceid": "21318102-60b6-4414-adb4-37b3496c9fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1834.708240] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Acquired lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1834.709266] env[68492]: DEBUG nova.network.neutron [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Refreshing network info cache for port 21318102-60b6-4414-adb4-37b3496c9fad {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1834.709415] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c2:53:a1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cd098b1c-636f-492d-b5ae-037cb0cae454', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '21318102-60b6-4414-adb4-37b3496c9fad', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1834.718334] env[68492]: DEBUG oslo.service.loopingcall [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1834.721737] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1834.722200] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ee00d31-18cf-433b-b97c-d655447319ce {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1834.746379] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1834.746379] env[68492]: value = "task-3395552" [ 1834.746379] env[68492]: _type = "Task" [ 1834.746379] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1834.755112] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395552, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1835.082569] env[68492]: DEBUG nova.network.neutron [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Updated VIF entry in instance network info cache for port 21318102-60b6-4414-adb4-37b3496c9fad. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1835.082910] env[68492]: DEBUG nova.network.neutron [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Updating instance_info_cache with network_info: [{"id": "21318102-60b6-4414-adb4-37b3496c9fad", "address": "fa:16:3e:c2:53:a1", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21318102-60", "ovs_interfaceid": "21318102-60b6-4414-adb4-37b3496c9fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1835.092099] env[68492]: DEBUG oslo_concurrency.lockutils [req-4b4a0725-f359-4441-9b40-509ab48314ea req-3b7c1420-abb0-4601-8cf3-882753c68115 service nova] Releasing lock "refresh_cache-a9111481-6ba1-4d76-bce9-8db609eb704d" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1835.256293] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395552, 'name': CreateVM_Task, 'duration_secs': 0.273739} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1835.256458] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1835.263266] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1835.263432] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1835.263757] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1835.264025] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-41759b0b-8081-4b5c-ac76-a56b3f9cc5cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1835.268458] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 1835.268458] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c523ec-3c12-5ee6-8095-05770dafa027" [ 1835.268458] env[68492]: _type = "Task" [ 1835.268458] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1835.276056] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c523ec-3c12-5ee6-8095-05770dafa027, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
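The CreateVM_Task and SearchDatastore_Task records above follow oslo.vmware's generic task-polling loop: invoke_api() issues the SOAP call and returns a Task reference, and wait_for_task() re-reads the TaskInfo (the "progress is 0%." lines come from _poll_task) until the task reaches success or raises a translated fault on error. A minimal sketch of that pattern, not Nova's exact code; the host, credentials and the ds_browser/ds_path arguments are placeholders:

    from oslo_vmware import api

    def search_datastore(ds_browser, ds_path):
        # Placeholder endpoint and credentials, assumed for illustration.
        session = api.VMwareAPISession(
            'vc.example.test', 'user', 'secret',
            api_retry_count=10, task_poll_interval=0.5)
        # invoke_api() hands back a managed object reference to a vSphere Task.
        task_ref = session.invoke_api(
            session.vim, 'SearchDatastore_Task', ds_browser,
            datastorePath=ds_path)
        # wait_for_task() polls TaskInfo until 'success', or raises a
        # translated fault (e.g. VimFaultException) on an 'error' state.
        return session.wait_for_task(task_ref)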
[ 1835.778696] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1835.779015] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1835.779183] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1835.982124] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "a9111481-6ba1-4d76-bce9-8db609eb704d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.231455] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1848.231579] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1848.231898] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1848.231898] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1848.252451] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.252635] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building.
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.252777] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.252905] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253234] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253234] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253366] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253411] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253523] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253638] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1848.253756] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}}
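The run of "Skipping network cache update" records is the selection pass inside _heal_instance_info_cache: instances still building have no settled network info worth refreshing, so the periodic task walks the list, heals the first eligible instance, and otherwise logs that none were found. A condensed paraphrase of that loop (field names follow the log; this is not Nova's literal code):

    def pick_instance_to_heal(instances):
        # Instances in vm_state 'building' are passed over; their network
        # info is still being allocated ("... because it is Building.").
        for inst in instances:
            if inst.vm_state == 'building':
                continue
            return inst  # first healable instance wins this period
        # Matches "Didn't find any instances for network info cache update."
        return None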
[ 1850.231340] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1850.231629] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1850.243050] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1850.243290] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1850.243460] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1850.243616] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1850.244849] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd17e9b-b3c3-49d8-8e05-855c4186e262 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.253615] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d2b40a-62d6-41be-bac0-b4cca0e74b9f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.267431] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb04880-0b4e-4608-8859-ec598ed6209b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.273657] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a241c2-fc85-453f-b462-2e235969192f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.303390] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180964MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1850.303584] env[68492]: DEBUG
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1850.303712] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1850.377564] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.377745] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.377874] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378015] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378241] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378387] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378508] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378623] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378736] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.378850] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1850.379057] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1850.379196] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1850.499057] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-846e0975-e8b2-49cf-a1e9-a1a37019e743 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.507327] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb54724-ec0a-4ffb-b86c-7421c5834ae8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.540631] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d884d5e-7195-496a-9009-4005305ef285 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.549660] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6a4b55-bfb7-4e57-8cf0-dc8a5873ec3d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1850.563465] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1850.571688] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider 
dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1850.586519] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1850.586709] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1852.581987] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.610153] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.610359] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.610537] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.610689] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}}
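The ComputeManager._poll_* lines come from oslo.service's periodic-task machinery: methods decorated with @periodic_task.periodic_task are collected on a PeriodicTasks subclass and driven by run_periodic_tasks(). A small self-contained sketch of that pattern, including the config-guarded no-op visible in the _reclaim_queued_deletes record just above (the option name is taken from the log; the spacing and default values are assumptions):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _poll_rescued_instances(self, context):
            pass  # real task body elided

        @periodic_task.periodic_task(spacing=10)
        def _reclaim_queued_deletes(self, context):
            # The guard behind "CONF.reclaim_instance_interval <= 0,
            # skipping...": a non-positive interval disables the task.
            if CONF.reclaim_instance_interval <= 0:
                return

    # The service loop invokes this on a timer; here, one manual pass.
    Manager().run_periodic_tasks(context=None)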
[ 1854.232311] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.232311] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1855.571453] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "c472a34d-b388-46c9-a7e0-7106b0666478" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.571777] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1855.595688] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.595915] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1881.446618] env[68492]: WARNING oslo_vmware.rw_handles [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1881.446618]
env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1881.446618] env[68492]: ERROR oslo_vmware.rw_handles [ 1881.447640] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1881.449067] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1881.449332] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Copying Virtual Disk [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/81c191f8-4afa-46ac-88cb-91f7246cdef4/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1881.449649] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b6ef3919-0ff5-4a53-8b60-c842217f2471 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.457992] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 1881.457992] env[68492]: value = "task-3395553" [ 1881.457992] env[68492]: _type = "Task" [ 1881.457992] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1881.465899] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395553, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1881.968925] env[68492]: DEBUG oslo_vmware.exceptions [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1881.969234] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1881.969859] env[68492]: ERROR nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1881.969859] env[68492]: Faults: ['InvalidArgument'] [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Traceback (most recent call last): [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] yield resources [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self.driver.spawn(context, instance, image_meta, [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self._fetch_image_if_missing(context, vi) [ 1881.969859] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] image_cache(vi, tmp_image_ds_loc) [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] vm_util.copy_virtual_disk( [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] session._wait_for_task(vmdk_copy_task) [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return self.wait_for_task(task_ref) [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return evt.wait() [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] result = hub.switch() [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1881.970305] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return self.greenlet.switch() [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self.f(*self.args, **self.kw) [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] raise exceptions.translate_fault(task_info.error) [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Faults: ['InvalidArgument'] [ 1881.970757] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] [ 1881.970757] env[68492]: INFO nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Terminating instance [ 1881.972417] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1881.972699] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1881.973028] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-50e60f39-17bc-4834-aceb-c0959b473fed {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.975982] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1881.976257] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1881.977307] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1c77828-347a-41fb-83f5-6fb4187656aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.986163] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1881.987487] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-775fa01d-49b5-44a7-98f0-bf6c900e1ad1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.989364] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1881.989599] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1881.990551] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f3ccde02-df3d-454b-80ad-09ac0e42c145 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1881.996728] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 1881.996728] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5243e10b-92db-e803-87b7-26c60b3237c6" [ 1881.996728] env[68492]: _type = "Task" [ 1881.996728] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1882.006641] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5243e10b-92db-e803-87b7-26c60b3237c6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1882.051816] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1882.052053] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1882.052239] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleting the datastore file [datastore2] fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1882.052505] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-510901d3-4ffb-4f04-9e43-bbf2215651a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.058169] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 1882.058169] env[68492]: value = "task-3395555" [ 1882.058169] env[68492]: _type = "Task" [ 1882.058169] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1882.065423] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395555, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1882.506662] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1882.506964] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating directory with path [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1882.507159] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3c31873-8214-49d9-8971-83ea51331e53 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.517943] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created directory with path [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1882.518146] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Fetch image to [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1882.518314] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1882.519007] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0878a729-fc5b-4bfc-b4ac-a0bf33284cb9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.525092] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52f1a45c-5cbe-48af-bb50-64239b33daf1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.533776] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbccb808-75e1-4da2-acb3-5936901ef991 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.566672] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8dd3d8a5-db42-4009-b3db-025d30719cc8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.573099] env[68492]: DEBUG oslo_vmware.api [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395555, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074259} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1882.574465] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1882.574653] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1882.574824] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1882.574993] env[68492]: INFO nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Took 0.60 seconds to destroy the instance on the hypervisor. 
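The records that follow show the failed build's resource claim being aborted under the "compute_resources" lock, the same oslo.concurrency primitive behind every Acquiring/acquired/"released" triplet in this log. A minimal sketch of the two usual forms, with placeholder function bodies (the lock names are real, the functions are illustrative):

    from oslo_concurrency import lockutils

    # Decorator form: every caller sharing the lock name is serialized,
    # which is what produces the acquired/"released" :: waited/held lines.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        pass  # roll the resource tracker's usage counters back (elided)

    # Context-manager form, as used for the per-instance cache lock.
    def refresh_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance's network info cache (elided)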
[ 1882.576694] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-14ba854f-b20b-43ee-a83e-d508743f597c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.578515] env[68492]: DEBUG nova.compute.claims [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1882.578681] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1882.578893] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1882.598432] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1882.652295] env[68492]: DEBUG oslo_vmware.rw_handles [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1882.711965] env[68492]: DEBUG oslo_vmware.rw_handles [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1882.712174] env[68492]: DEBUG oslo_vmware.rw_handles [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1882.830817] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d610ebd6-ca38-4863-b5bb-5e8fe2833abf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.838244] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49fba71-8e24-4a90-aa68-0068d0ff17fc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.867088] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75ede037-db39-4205-af5d-72f333d15eb6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.873877] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b96b1a8-e920-4062-9219-b0ecc89412fd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1882.887446] env[68492]: DEBUG nova.compute.provider_tree [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1882.895419] env[68492]: DEBUG nova.scheduler.client.report [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1882.908478] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.330s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1882.908994] env[68492]: ERROR nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1882.908994] env[68492]: Faults: ['InvalidArgument'] [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Traceback (most recent call last): [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: 
fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self.driver.spawn(context, instance, image_meta, [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self._fetch_image_if_missing(context, vi) [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] image_cache(vi, tmp_image_ds_loc) [ 1882.908994] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] vm_util.copy_virtual_disk( [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] session._wait_for_task(vmdk_copy_task) [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return self.wait_for_task(task_ref) [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return evt.wait() [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] result = hub.switch() [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] return self.greenlet.switch() [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1882.909630] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] self.f(*self.args, **self.kw) [ 1882.910243] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1882.910243] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] raise exceptions.translate_fault(task_info.error) [ 1882.910243] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1882.910243] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Faults: ['InvalidArgument'] [ 1882.910243] env[68492]: ERROR nova.compute.manager [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] [ 1882.910243] env[68492]: DEBUG nova.compute.utils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1882.911040] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Build of instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 was re-scheduled: A specified parameter was not correct: fileType [ 1882.911040] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1882.911418] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1882.911586] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1882.911755] env[68492]: DEBUG nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1882.911918] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1883.209243] env[68492]: DEBUG nova.network.neutron [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1883.220870] env[68492]: INFO nova.compute.manager [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Took 0.31 seconds to deallocate network for instance. [ 1883.308501] env[68492]: INFO nova.scheduler.client.report [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleted allocations for instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 [ 1883.331387] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4f985d4f-0703-4716-831a-fd77a36c8f5c tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 684.329s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1883.333030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 487.673s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1883.333030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1883.333030] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1883.333259] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1883.335131] env[68492]: INFO nova.compute.manager [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Terminating instance [ 1883.336821] env[68492]: DEBUG nova.compute.manager [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1883.337060] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1883.337593] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-edd17a4c-bece-454d-872c-2565f773d521 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.342366] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1883.348893] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87cc432a-b1ed-467f-b006-95873e66ec06 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.378149] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5 could not be found. [ 1883.378365] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1883.378551] env[68492]: INFO nova.compute.manager [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1883.378841] env[68492]: DEBUG oslo.service.loopingcall [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1883.379624] env[68492]: DEBUG nova.compute.manager [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1883.379723] env[68492]: DEBUG nova.network.neutron [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1883.395574] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1883.395851] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1883.397254] env[68492]: INFO nova.compute.claims [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1883.406307] env[68492]: DEBUG nova.network.neutron [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1883.416423] env[68492]: INFO nova.compute.manager [-] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] Took 0.04 seconds to deallocate network for instance. [ 1883.527322] env[68492]: DEBUG oslo_concurrency.lockutils [None req-b38975aa-2021-4306-9e04-ad8246c90cd6 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1883.528264] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 115.282s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1883.529319] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1883.529765] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "fd99ede7-d9a0-46a1-85bd-0c3fdbc1e5c5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1883.576773] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e67a1bed-2f00-4f49-9d4a-0bd196e1e140 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.584608] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b096fe2a-8885-45a1-95c0-cf6f4edf00aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.616935] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfa61edf-f7ff-47a8-8bcb-274f924829a6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.624512] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-553c9ec4-c2fb-4f40-9408-f66b1951a9ca {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.637557] env[68492]: DEBUG nova.compute.provider_tree [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1883.645480] env[68492]: DEBUG nova.scheduler.client.report [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1883.660019] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1883.660505] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1883.698901] env[68492]: DEBUG nova.compute.utils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1883.700178] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1883.700351] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1883.708097] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1883.756962] env[68492]: DEBUG nova.policy [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eca85f521b2f4a9c9ecf05120198f3de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4155239cd01a410fa600f06c709fe5c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1883.768250] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1883.793596] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1883.793596] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1883.793973] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1883.794065] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1883.794201] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1883.794352] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1883.794687] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1883.794763] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1883.794948] env[68492]: DEBUG nova.virt.hardware [None 
req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1883.795280] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1883.795518] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1883.796387] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a636fe45-bcba-4319-85ea-34c6f54d15a8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1883.804573] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aa25cbb-8a95-4d5e-9271-992e7a0c1462 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1884.396478] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Successfully created port: 0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1884.949553] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Successfully updated port: 0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1884.960710] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1884.960874] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1884.961033] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1884.996403] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 
tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1885.146374] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Updating instance_info_cache with network_info: [{"id": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "address": "fa:16:3e:68:3b:5b", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b8ae3ef-34", "ovs_interfaceid": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1885.158856] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Releasing lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1885.159152] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance network_info: |[{"id": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "address": "fa:16:3e:68:3b:5b", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b8ae3ef-34", "ovs_interfaceid": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1885.159525] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:3b:5b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0b8ae3ef-3489-4a55-bf11-4dd1bf73a921', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1885.167201] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating folder: Project (4155239cd01a410fa600f06c709fe5c6). Parent ref: group-v677434. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1885.167670] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5c9d769c-3bcc-4d67-abf7-cb484f3fe62b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.179043] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created folder: Project (4155239cd01a410fa600f06c709fe5c6) in parent group-v677434. [ 1885.179137] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating folder: Instances. Parent ref: group-v677549. {{(pid=68492) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1885.179359] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41b4b608-147e-4b99-8489-9c99063dc941 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.187097] env[68492]: INFO nova.virt.vmwareapi.vm_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created folder: Instances in parent group-v677549. [ 1885.187330] env[68492]: DEBUG oslo.service.loopingcall [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1885.187531] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1885.187702] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bf2ee7e9-9dfb-442c-8088-433d3d820918 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.206288] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1885.206288] env[68492]: value = "task-3395558" [ 1885.206288] env[68492]: _type = "Task" [ 1885.206288] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1885.213670] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395558, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1885.301391] env[68492]: DEBUG nova.compute.manager [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Received event network-vif-plugged-0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1885.301391] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Acquiring lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1885.301391] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1885.301391] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1885.301563] env[68492]: DEBUG nova.compute.manager [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] No waiting events found dispatching network-vif-plugged-0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1885.301682] env[68492]: WARNING nova.compute.manager [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Received unexpected event network-vif-plugged-0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 for instance with vm_state building and task_state spawning. 
[ 1885.301843] env[68492]: DEBUG nova.compute.manager [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Received event network-changed-0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1885.302031] env[68492]: DEBUG nova.compute.manager [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Refreshing instance network info cache due to event network-changed-0b8ae3ef-3489-4a55-bf11-4dd1bf73a921. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1885.302225] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Acquiring lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1885.302360] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Acquired lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1885.302514] env[68492]: DEBUG nova.network.neutron [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Refreshing network info cache for port 0b8ae3ef-3489-4a55-bf11-4dd1bf73a921 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1885.611661] env[68492]: DEBUG nova.network.neutron [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Updated VIF entry in instance network info cache for port 0b8ae3ef-3489-4a55-bf11-4dd1bf73a921. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1885.612019] env[68492]: DEBUG nova.network.neutron [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Updating instance_info_cache with network_info: [{"id": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "address": "fa:16:3e:68:3b:5b", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b8ae3ef-34", "ovs_interfaceid": "0b8ae3ef-3489-4a55-bf11-4dd1bf73a921", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1885.621351] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f80e03a-b686-42e2-a686-8e91640df578 req-e735d9a5-bdc4-4016-a7b4-a4ceff602e4e service nova] Releasing lock "refresh_cache-c472a34d-b388-46c9-a7e0-7106b0666478" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1885.716732] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395558, 'name': CreateVM_Task, 'duration_secs': 0.294638} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1885.716882] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1885.717497] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1885.717673] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1885.717990] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1885.718258] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ce11e60-5713-4b03-964d-d6fd2e7b6ef6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1885.722422] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 1885.722422] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52192545-894c-7210-60a2-12fb87b4d427" [ 1885.722422] env[68492]: _type = "Task" [ 1885.722422] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1885.729386] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52192545-894c-7210-60a2-12fb87b4d427, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1886.233208] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1886.233514] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1886.233676] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1906.232392] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1910.232527] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1910.232808] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1910.232808] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1910.254838] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255060] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255150] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255236] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255357] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255476] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255595] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255732] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255838] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.255956] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1910.256077] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1910.256572] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1912.231423] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1912.243585] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1912.243802] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.243971] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1912.244138] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1912.245256] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc36d24c-9cfe-45b3-8f73-6e3cb0e5fda6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.254195] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33a93039-c14b-45d7-a53e-97674e514a12 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.269139] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a864c2-461f-4a9d-9673-12b55e150374 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.275277] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2012967f-aa64-4fa3-a7f2-0363cfdf13bf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.303719] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1912.303858] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1912.304059] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.374904] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 03afef99-e2dd-4467-8426-fbe50481aa6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375048] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375177] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375299] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375417] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375532] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375645] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375757] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375868] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.375978] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1912.388778] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 1912.388988] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1912.389150] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1912.514120] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d8d22a7-2105-4832-b41e-86f3f583f0a8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.521560] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aae6451-889b-411a-b71f-4961ab178a15 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.552721] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76865337-f942-412e-8cf7-0eb7d89dfcd4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.559525] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8683979a-c446-4322-8760-49c074e551c3 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.572301] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1912.580653] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1912.593345] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1912.593518] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.289s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1914.589379] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.589747] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.589747] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.589938] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.590067] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.590219] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1927.781234] env[68492]: WARNING oslo_vmware.rw_handles [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1927.781234] env[68492]: ERROR oslo_vmware.rw_handles [ 1927.781942] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1927.783657] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1927.783926] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Copying Virtual Disk [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/83fd77ea-511a-4bfe-ad1a-feaa665317fa/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1927.784240] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-db9c5f0b-d544-49b3-b58e-6e8dbec794aa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.792644] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 
tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 1927.792644] env[68492]: value = "task-3395559" [ 1927.792644] env[68492]: _type = "Task" [ 1927.792644] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1927.800865] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395559, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1928.303264] env[68492]: DEBUG oslo_vmware.exceptions [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1928.303504] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1928.304062] env[68492]: ERROR nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1928.304062] env[68492]: Faults: ['InvalidArgument'] [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Traceback (most recent call last): [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] yield resources [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self.driver.spawn(context, instance, image_meta, [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self._fetch_image_if_missing(context, vi) [ 1928.304062] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] image_cache(vi, tmp_image_ds_loc) [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] vm_util.copy_virtual_disk( [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] session._wait_for_task(vmdk_copy_task) [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return self.wait_for_task(task_ref) [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return evt.wait() [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] result = hub.switch() [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1928.304531] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return self.greenlet.switch() [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self.f(*self.args, **self.kw) [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] raise exceptions.translate_fault(task_info.error) [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Faults: ['InvalidArgument'] [ 1928.304939] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] [ 1928.304939] env[68492]: INFO nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Terminating instance [ 1928.305939] env[68492]: 
DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1928.306172] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1928.306412] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3fc084bf-8f2d-4dd3-bf59-a6c2663d86c7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.308784] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1928.309036] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1928.309776] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0d4345-dd8a-4471-94d8-bfc108e34863 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.316283] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1928.316488] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-46e0eeeb-f413-483f-bcd1-1970c26e2fd6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.318580] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1928.318751] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1928.319658] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c2e36cb-08d1-46bf-a5f1-3959bbdd6989 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.324499] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for the task: (returnval){ [ 1928.324499] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c40136-3fb9-6cab-c1e0-fa94c561bce0" [ 1928.324499] env[68492]: _type = "Task" [ 1928.324499] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1928.331447] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52c40136-3fb9-6cab-c1e0-fa94c561bce0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1928.385101] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1928.385329] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1928.385491] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleting the datastore file [datastore2] 03afef99-e2dd-4467-8426-fbe50481aa6f {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1928.385797] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-89be8211-d92d-4753-a986-f970a98b8411 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.392444] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 1928.392444] env[68492]: value = "task-3395561" [ 1928.392444] env[68492]: _type = "Task" [ 1928.392444] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1928.400066] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395561, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1928.835299] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1928.835721] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Creating directory with path [datastore2] vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1928.835804] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-02569907-da99-40fe-8baa-3dccf37ae148 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.848434] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Created directory with path [datastore2] vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1928.848612] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Fetch image to [datastore2] vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1928.848780] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1928.849499] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd60668a-394e-4869-8dc8-a5b178d7f688 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.855876] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55180bcf-af13-40f5-afa4-787d72f1e73f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.865572] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e822230d-db49-486f-a514-7825ca9e45af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.897602] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05b14404-2e5b-4b63-b190-cb08f621f73f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.904433] env[68492]: DEBUG oslo_vmware.api [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395561, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074907} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1928.905873] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1928.906078] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1928.906255] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1928.906428] env[68492]: INFO nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Took 0.60 seconds to destroy the instance on the hypervisor. 
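The DeleteDatastoreFile_Task record above, like the earlier CopyVirtualDisk_Task failure, goes through oslo.vmware's invoke-and-wait pattern: the driver calls a vCenter `*_Task` method through the session, then blocks in `wait_for_task`, which polls the task info and, on error, raises the translated fault seen in the traceback (`VimFaultException` with `Faults: ['InvalidArgument']`). A minimal sketch of that pattern follows; it assumes an established `VMwareAPISession`, and the host, credentials, datastore paths, and `dc_ref` are placeholders, not values from this log:

```python
# Sketch of the oslo.vmware task-polling pattern seen in this log:
# invoke a vCenter *_Task method, then block in wait_for_task(), which
# polls task.info and raises a translated fault on error.
from oslo_vmware import api, exceptions

# Placeholder endpoint/credentials (not from this environment).
session = api.VMwareAPISession(
    'vcenter.example.test', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

dc_ref = None  # Datacenter managed-object ref, looked up elsewhere

copy_task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task',
    session.vim.service_content.virtualDiskManager,
    sourceName='[datastore2] vmware_temp/example/tmp-sparse.vmdk',
    sourceDatacenter=dc_ref,
    destName='[datastore2] vmware_temp/example/image.vmdk',
    destDatacenter=dc_ref)

try:
    # Blocks while polling the task; returns the final TaskInfo.
    task_info = session.wait_for_task(copy_task)
    print('copy finished:', task_info.state)
except exceptions.VimFaultException as e:
    # vCenter task errors are translated before being raised; the
    # fault names (e.g. 'InvalidArgument') are in e.fault_list.
    print('copy failed:', e.fault_list, str(e))
```

This mirrors the invoke-and-wait sequence that `nova.virt.vmwareapi.vm_util.copy_virtual_disk` performs in the traceback above, where the fault propagates out of `wait_for_task` and aborts the spawn.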
[ 1928.908161] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0cf1a983-cbcb-4542-b9a2-b84f3431943d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1928.910015] env[68492]: DEBUG nova.compute.claims [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1928.910189] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1928.910399] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1928.934874] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1929.081024] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dad4bf9e-3688-466e-94e9-54f295047af1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.089648] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4654641d-2092-46ba-94a3-b42feb282129 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.121512] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a82f423b-fa83-4aa4-94f4-63c23bad6343 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.129621] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33e199fb-d985-4e52-a9c8-ff12bb7a612e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.143676] env[68492]: DEBUG nova.compute.provider_tree [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1929.153262] env[68492]: DEBUG nova.scheduler.client.report [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 
tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1929.167851] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.257s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1929.168437] env[68492]: ERROR nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1929.168437] env[68492]: Faults: ['InvalidArgument'] [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Traceback (most recent call last): [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self.driver.spawn(context, instance, image_meta, [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self._fetch_image_if_missing(context, vi) [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] image_cache(vi, tmp_image_ds_loc) [ 1929.168437] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] vm_util.copy_virtual_disk( [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 
03afef99-e2dd-4467-8426-fbe50481aa6f] session._wait_for_task(vmdk_copy_task) [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return self.wait_for_task(task_ref) [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return evt.wait() [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] result = hub.switch() [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] return self.greenlet.switch() [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1929.168856] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] self.f(*self.args, **self.kw) [ 1929.169239] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1929.169239] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] raise exceptions.translate_fault(task_info.error) [ 1929.169239] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1929.169239] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Faults: ['InvalidArgument'] [ 1929.169239] env[68492]: ERROR nova.compute.manager [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] [ 1929.169239] env[68492]: DEBUG nova.compute.utils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1929.170874] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Build of instance 03afef99-e2dd-4467-8426-fbe50481aa6f was re-scheduled: A specified parameter was not correct: fileType [ 1929.170874] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1929.171288] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 
tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1929.171463] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1929.171637] env[68492]: DEBUG nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1929.171793] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1929.263522] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1929.324912] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1929.325179] env[68492]: DEBUG oslo_vmware.rw_handles [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1929.524264] env[68492]: DEBUG nova.network.neutron [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1929.539275] env[68492]: INFO nova.compute.manager [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Took 0.37 seconds to deallocate network for instance. [ 1929.638837] env[68492]: INFO nova.scheduler.client.report [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleted allocations for instance 03afef99-e2dd-4467-8426-fbe50481aa6f [ 1929.661597] env[68492]: DEBUG oslo_concurrency.lockutils [None req-815759a4-4324-4b45-a6bd-28b7cc3b293f tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 683.607s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1929.662704] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 487.348s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1929.662931] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1929.663151] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1929.663316] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1929.666818] env[68492]: INFO nova.compute.manager [None
req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Terminating instance [ 1929.668501] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1929.669020] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1929.669020] env[68492]: DEBUG nova.network.neutron [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1929.680362] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1929.694390] env[68492]: DEBUG nova.network.neutron [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1929.737687] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1929.738034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1929.739602] env[68492]: INFO nova.compute.claims [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1929.848752] env[68492]: DEBUG nova.network.neutron [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1929.857178] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "refresh_cache-03afef99-e2dd-4467-8426-fbe50481aa6f" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1929.857631] env[68492]: DEBUG nova.compute.manager [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1929.857852] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1929.858404] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6e7157e9-9649-4e31-b279-a8b230bacbcb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.869748] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3567f33d-5383-4846-b1be-0b67f674042f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.899562] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 03afef99-e2dd-4467-8426-fbe50481aa6f could not be found. [ 1929.899763] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1929.899968] env[68492]: INFO nova.compute.manager [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1929.900228] env[68492]: DEBUG oslo.service.loopingcall [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1929.902454] env[68492]: DEBUG nova.compute.manager [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1929.902566] env[68492]: DEBUG nova.network.neutron [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1929.918485] env[68492]: DEBUG nova.network.neutron [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Instance cache missing network info.
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1929.923301] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fba618e-ede1-410f-ba51-7da1fc10bb06 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.926203] env[68492]: DEBUG nova.network.neutron [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1929.932515] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afcdba67-3993-4214-a539-5cdcabdabfaa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.936067] env[68492]: INFO nova.compute.manager [-] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] Took 0.03 seconds to deallocate network for instance. [ 1929.965442] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f542bc1-508b-4a5e-b687-3c6af6d927ab {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.975291] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2e9d7c-4543-4f31-8ec7-c8af24e698e9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1929.990211] env[68492]: DEBUG nova.compute.provider_tree [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1929.999602] env[68492]: DEBUG nova.scheduler.client.report [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1930.010972] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1930.011500] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1930.047140] env[68492]: DEBUG nova.compute.utils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1930.048619] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1930.048798] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1930.052632] env[68492]: DEBUG oslo_concurrency.lockutils [None req-140df3a9-8ec7-43ce-8e21-86b0a8c1e5fe tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.390s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1930.053671] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 161.807s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1930.053862] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 03afef99-e2dd-4467-8426-fbe50481aa6f] During sync_power_state the instance has a pending task (deleting). Skip. [ 1930.054043] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "03afef99-e2dd-4467-8426-fbe50481aa6f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1930.057605] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Start building block device mappings for instance.
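
Note the _sync_power_states worker above: it waited 161.807s for the per-instance lock, then found task_state "deleting" and skipped instead of racing the in-flight delete. A sketch of that guard; field names follow the log and are not necessarily Nova's internals:

def query_driver_power_state_and_sync(instance, driver_power_state):
    # A pending task means another operation owns this instance right
    # now; syncing power state could race with it, so bail out early.
    if instance.get("task_state") is not None:
        print(f"During sync_power_state the instance has a pending task "
              f"({instance['task_state']}). Skip.")
        return None
    return driver_power_state  # would be reconciled with the DB here

query_driver_power_state_and_sync(
    {"uuid": "03afef99-e2dd-4467-8426-fbe50481aa6f",
     "task_state": "deleting"},
    driver_power_state="SHUTDOWN",
)
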
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1930.117869] env[68492]: DEBUG nova.policy [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eca85f521b2f4a9c9ecf05120198f3de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4155239cd01a410fa600f06c709fe5c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1930.121315] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1930.145937] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1930.146185] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1930.146337] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1930.146512] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1930.146655] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1930.146800] env[68492]: DEBUG nova.virt.hardware [None
req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1930.147016] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1930.147179] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1930.147341] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1930.147495] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1930.147661] env[68492]: DEBUG nova.virt.hardware [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1930.148529] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7494548f-ef17-4392-97bb-968f1c790bdd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1930.156367] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94e93de4-3f08-4869-985c-62db26231a2c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1930.605346] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Successfully created port: d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1931.447656] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Successfully updated port: d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1931.459759] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 
tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1931.459966] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1931.460366] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1931.523129] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1931.694622] env[68492]: DEBUG nova.compute.manager [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Received event network-vif-plugged-d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1931.694927] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Acquiring lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1931.695157] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1931.695327] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1931.695492] env[68492]: DEBUG nova.compute.manager [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] No waiting events found dispatching network-vif-plugged-d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1931.696168] env[68492]: WARNING nova.compute.manager [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Received unexpected event
network-vif-plugged-d44678e1-d3bf-4fc3-8715-8a559b85f200 for instance with vm_state building and task_state spawning. [ 1931.696168] env[68492]: DEBUG nova.compute.manager [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Received event network-changed-d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1931.696168] env[68492]: DEBUG nova.compute.manager [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Refreshing instance network info cache due to event network-changed-d44678e1-d3bf-4fc3-8715-8a559b85f200. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1931.696463] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Acquiring lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1931.809214] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Updating instance_info_cache with network_info: [{"id": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "address": "fa:16:3e:ed:73:b2", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44678e1-d3", "ovs_interfaceid": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1931.820018] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Releasing lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1931.820332] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Instance network_info: |[{"id": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "address": "fa:16:3e:ed:73:b2", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", 
"label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44678e1-d3", "ovs_interfaceid": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1931.820632] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Acquired lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1931.820812] env[68492]: DEBUG nova.network.neutron [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Refreshing network info cache for port d44678e1-d3bf-4fc3-8715-8a559b85f200 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1931.821838] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ed:73:b2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd44678e1-d3bf-4fc3-8715-8a559b85f200', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1931.829190] env[68492]: DEBUG oslo.service.loopingcall [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1931.830229] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1931.832543] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-28e9e5af-09ba-44c7-bdf4-638ca11e10cd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1931.852941] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1931.852941] env[68492]: value = "task-3395562" [ 1931.852941] env[68492]: _type = "Task" [ 1931.852941] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1931.860618] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395562, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1932.077983] env[68492]: DEBUG nova.network.neutron [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Updated VIF entry in instance network info cache for port d44678e1-d3bf-4fc3-8715-8a559b85f200. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1932.078375] env[68492]: DEBUG nova.network.neutron [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Updating instance_info_cache with network_info: [{"id": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "address": "fa:16:3e:ed:73:b2", "network": {"id": "a121fe2c-9259-4f9f-8efa-2b73b77cfbb7", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1993819807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4155239cd01a410fa600f06c709fe5c6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44678e1-d3", "ovs_interfaceid": "d44678e1-d3bf-4fc3-8715-8a559b85f200", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1932.088752] env[68492]: DEBUG oslo_concurrency.lockutils [req-375c5dc6-b499-4c13-b297-689286f72cf6 req-fa10baf5-da39-428d-9b12-b33f0f26258a service nova] Releasing lock "refresh_cache-ffddeec8-4442-413c-a0a0-2cf2b110cf14" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1932.363370] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395562, 'name': CreateVM_Task, 'duration_secs': 0.288816} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1932.363533] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1932.364216] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1932.364376] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1932.364687] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1932.364935] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2fc6d6bf-9ea4-4450-8cd9-44f337e0568b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1932.369528] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 1932.369528] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52f85385-f0a8-09d9-8c0b-18d64f3c2901" [ 1932.369528] env[68492]: _type = "Task" [ 1932.369528] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1932.376795] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52f85385-f0a8-09d9-8c0b-18d64f3c2901, 'name': SearchDatastore_Task} progress is 0%. 
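
The wait_for_task sequences above ("Waiting for the task ... to complete", periodic "progress is N%" lines, then a duration on success) are a plain poll loop. A simplified stdlib sketch of the pattern; the real oslo_vmware.api version runs the poll on a looping call and translates vCenter faults into typed exceptions:

import time

def wait_for_task(poll_task, interval=0.5):
    """Poll until the task dict reports a terminal state.

    `poll_task` returns e.g. {'state': 'running', 'progress': 40},
    {'state': 'success'} or {'state': 'error', 'error': '...'}.
    """
    start = time.monotonic()
    while True:
        info = poll_task()
        if info["state"] == "success":
            print(f"completed successfully after "
                  f"{time.monotonic() - start:.6f}s")
            return info
        if info["state"] == "error":
            raise RuntimeError(info.get("error", "task failed"))
        print(f"progress is {info.get('progress', 0)}%.")
        time.sleep(interval)

# Fake task: running on the first poll, done on the second.
_states = iter([{"state": "running", "progress": 0}, {"state": "success"}])
wait_for_task(lambda: next(_states), interval=0.01)
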
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1932.879499] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1932.879819] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1932.879982] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1967.232715] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1971.231717] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1971.232137] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 1971.232137] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 1971.253893] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254119] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254173] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254297] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254418] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254532] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254650] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254767] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.254885] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.255010] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 1971.255151] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
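
The _heal_instance_info_cache pass above walks every instance on the host, skips the ones still Building, and ends with "Didn't find any instances for network info cache update." A sketch of that filtering step; illustrative only, the real method also rotates through instances across periodic runs:

def instances_to_heal(instances):
    """Yield instances whose network info cache is safe to refresh."""
    for inst in instances:
        if inst["vm_state"] == "building":
            print(f"[instance: {inst['uuid']}] Skipping network cache "
                  f"update for instance because it is Building.")
            continue
        yield inst

pending = [{"uuid": "ffddeec8-4442-413c-a0a0-2cf2b110cf14",
            "vm_state": "building"}]
if not list(instances_to_heal(pending)):
    print("Didn't find any instances for network info cache update.")
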
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 1971.255614] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.231055] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.243347] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.243561] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.243731] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1973.243887] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1973.245273] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ba4ca1-bb2a-4dcb-abf0-4d63999a5119 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.257960] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0636b147-7c53-4e4d-88e3-5b92afc292b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.273900] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4cda047-088f-4cf0-893a-e327fbcb347f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.280770] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3493899-0d6e-4e50-b8b0-483c4b6c90e3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.309926] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180957MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1973.310200] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.310314] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.386334] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.386503] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.386630] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.386753] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.386875] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.386992] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.387130] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.387247] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.387361] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.387473] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 1973.387670] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1973.387807] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1973.525056] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a823bcd4-b4a9-4d20-b844-15b90d805174 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.533017] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71243ead-203b-4203-9ddc-8d2fff8617da {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.563094] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc9cffac-bc04-4c24-9cc9-ea7ba35b2b11 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.570769] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b15bf042-c76c-4ad9-965e-77a6ccc78423 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.584344] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1973.594028] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider 
dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1973.607395] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1973.607651] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.608536] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.608953] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.608953] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.609076] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.609226] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
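
Both "Inventory has not changed" lines in this section come from the same idempotence check: the resource tracker compares freshly computed inventory against its cached ProviderTree copy and only pushes to placement when they differ. A hedged sketch of that comparison, using the VCPU/MEMORY_MB figures from the log:

def maybe_update_inventory(cached, new):
    """Return True only when inventory actually needs to be pushed."""
    if cached == new:
        print("Inventory has not changed; skipping placement update.")
        return False
    cached.clear()
    cached.update(new)  # the placement PUT would happen here
    return True

inventory = {
    "VCPU": {"total": 48, "reserved": 0, "min_unit": 1, "max_unit": 16,
             "step_size": 1, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "min_unit": 1,
                  "max_unit": 65530, "step_size": 1,
                  "allocation_ratio": 1.0},
}
assert maybe_update_inventory(dict(inventory), dict(inventory)) is False
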
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 1976.227646] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1976.227916] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1977.585802] env[68492]: WARNING oslo_vmware.rw_handles [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1977.585802] env[68492]: ERROR oslo_vmware.rw_handles [ 1977.586745] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1977.588577] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1977.588809] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Copying Virtual Disk [datastore2] vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] 
vmware_temp/bff64a58-de2d-44fc-9f22-5ade6441b20d/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1977.589106] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-66abf5d7-4c27-43e0-81b4-62de6f826e3b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.596690] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for the task: (returnval){ [ 1977.596690] env[68492]: value = "task-3395563" [ 1977.596690] env[68492]: _type = "Task" [ 1977.596690] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1977.604156] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Task: {'id': task-3395563, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1978.108528] env[68492]: DEBUG oslo_vmware.exceptions [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1978.108811] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1978.109366] env[68492]: ERROR nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1978.109366] env[68492]: Faults: ['InvalidArgument'] [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Traceback (most recent call last): [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] yield resources [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self.driver.spawn(context, instance, image_meta, [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 
539, in spawn [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self._fetch_image_if_missing(context, vi) [ 1978.109366] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] image_cache(vi, tmp_image_ds_loc) [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] vm_util.copy_virtual_disk( [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] session._wait_for_task(vmdk_copy_task) [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return self.wait_for_task(task_ref) [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return evt.wait() [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] result = hub.switch() [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1978.109819] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return self.greenlet.switch() [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self.f(*self.args, **self.kw) [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] raise exceptions.translate_fault(task_info.error) [ 1978.110354] env[68492]: ERROR nova.compute.manager 
[instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Faults: ['InvalidArgument'] [ 1978.110354] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] [ 1978.110354] env[68492]: INFO nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Terminating instance [ 1978.112017] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1978.112017] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1978.112017] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc30ee02-69d3-4ac2-a43a-98653f08d203 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.113892] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1978.114094] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1978.114834] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dfb05f7-114e-4dea-9e81-e0482902179a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.121597] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1978.121818] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e569ae1d-bcef-4df9-b532-169637bf3eb8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.123988] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1978.124170] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1978.125128] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b89a4121-75e2-40f3-b6ef-a5c0471b3289 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.129690] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for the task: (returnval){ [ 1978.129690] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e4fcd1-acb3-9da0-7587-377360a71d2f" [ 1978.129690] env[68492]: _type = "Task" [ 1978.129690] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1978.136718] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52e4fcd1-acb3-9da0-7587-377360a71d2f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1978.195173] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1978.195391] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1978.195568] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Deleting the datastore file [datastore2] b0757e62-96ca-4758-8444-dcc98fbf0a29 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1978.195846] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7f2cea16-0cc3-409d-aa6f-cebca7738520 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.201566] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for the task: (returnval){ [ 1978.201566] env[68492]: value = "task-3395565" [ 1978.201566] env[68492]: _type = "Task" [ 1978.201566] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1978.209907] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Task: {'id': task-3395565, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1978.640505] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1978.640946] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Creating directory with path [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1978.641032] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80448a88-49cb-4743-9983-4c533d901c2d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.652897] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Created directory with path [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1978.653098] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Fetch image to [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1978.653264] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1978.653981] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24c1a848-e6dc-4c75-a78a-e6bf2ecea64d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.660498] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54421404-531b-42b8-b2bf-3d41a925fcc5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.669213] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f63d9bc2-c0bb-4198-982b-bf7bea9f68c4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.700191] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b323da02-19b3-4e36-8d0e-8db0b453611a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.711923] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-716cfd15-9b20-4a69-b8ce-0813744667f2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1978.713527] env[68492]: DEBUG oslo_vmware.api [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Task: {'id': task-3395565, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068268} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1978.713760] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1978.713937] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1978.714116] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1978.714312] env[68492]: INFO nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1978.716429] env[68492]: DEBUG nova.compute.claims [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1978.716648] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1978.716887] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1978.734324] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1978.868346] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1978.957612] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1978.957961] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1978.994923] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d6b8c07-9af9-4538-b0fe-38acde36e080 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.003973] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0460b28-e5a4-48b5-86d8-776e229e00b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.034280] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7095c221-aa09-42b6-9d04-0592bce80cd1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.041175] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c44f141-f39c-4fb1-9b67-4513320b01d1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.053973] env[68492]: DEBUG nova.compute.provider_tree [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1979.062171] env[68492]: DEBUG nova.scheduler.client.report [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1979.077016] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.360s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1979.077604] env[68492]: ERROR nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1979.077604] env[68492]: Faults: ['InvalidArgument'] [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Traceback (most recent call last): [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/compute/manager.py", line 2616, 
in _build_and_run_instance [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self.driver.spawn(context, instance, image_meta, [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self._fetch_image_if_missing(context, vi) [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] image_cache(vi, tmp_image_ds_loc) [ 1979.077604] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] vm_util.copy_virtual_disk( [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] session._wait_for_task(vmdk_copy_task) [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return self.wait_for_task(task_ref) [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return evt.wait() [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] result = hub.switch() [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] return self.greenlet.switch() [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1979.077971] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] self.f(*self.args, **self.kw) [ 1979.078417] env[68492]: ERROR nova.compute.manager [instance: 
b0757e62-96ca-4758-8444-dcc98fbf0a29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1979.078417] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] raise exceptions.translate_fault(task_info.error) [ 1979.078417] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1979.078417] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Faults: ['InvalidArgument'] [ 1979.078417] env[68492]: ERROR nova.compute.manager [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] [ 1979.078417] env[68492]: DEBUG nova.compute.utils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1979.079706] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Build of instance b0757e62-96ca-4758-8444-dcc98fbf0a29 was re-scheduled: A specified parameter was not correct: fileType [ 1979.079706] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1979.080083] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1979.080281] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1979.080451] env[68492]: DEBUG nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1979.080615] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1979.603181] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1979.604034] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1979.632576] env[68492]: DEBUG nova.network.neutron [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1979.644570] env[68492]: INFO nova.compute.manager [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Took 0.56 seconds to deallocate network for instance. 
[ 1979.754404] env[68492]: INFO nova.scheduler.client.report [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Deleted allocations for instance b0757e62-96ca-4758-8444-dcc98fbf0a29 [ 1979.772605] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1524a8f7-b577-4a20-afbe-4c6ec276d774 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.965s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1979.773705] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 428.236s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1979.773915] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Acquiring lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1979.774138] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1979.774309] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1979.776503] env[68492]: INFO nova.compute.manager [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Terminating instance [ 1979.778126] env[68492]: DEBUG nova.compute.manager [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1979.778315] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1979.778831] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-562963d2-1d7f-48cf-ad63-ac28d3457e0b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.789081] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88de68d9-46db-444c-bca9-22cc3d567ef2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1979.800035] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1979.820828] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b0757e62-96ca-4758-8444-dcc98fbf0a29 could not be found. [ 1979.821038] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1979.821224] env[68492]: INFO nova.compute.manager [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1979.821497] env[68492]: DEBUG oslo.service.loopingcall [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1979.821764] env[68492]: DEBUG nova.compute.manager [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1979.821880] env[68492]: DEBUG nova.network.neutron [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1979.856572] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1979.856872] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1979.858611] env[68492]: INFO nova.compute.claims [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1979.861505] env[68492]: DEBUG nova.network.neutron [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1979.871842] env[68492]: INFO nova.compute.manager [-] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] Took 0.05 seconds to deallocate network for instance. [ 1979.974185] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1cbdaf71-bc72-4f51-a272-972fd95e51c2 tempest-ServerMetadataNegativeTestJSON-445016512 tempest-ServerMetadataNegativeTestJSON-445016512-project-member] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.200s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1979.978018] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 211.728s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1979.978018] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: b0757e62-96ca-4758-8444-dcc98fbf0a29] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1979.978018] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "b0757e62-96ca-4758-8444-dcc98fbf0a29" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1980.050110] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1ae5df-454f-4c1e-9e30-af9d5dca5b61 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.058024] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e379479-3a68-4a46-8d66-99acbef1344a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.088811] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8861cfe5-371e-4397-a397-d3fac5c76779 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.095960] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-944746a8-affa-4336-8035-2f2c46378a21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.109293] env[68492]: DEBUG nova.compute.provider_tree [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1980.118424] env[68492]: DEBUG nova.scheduler.client.report [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1980.132102] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1980.132586] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1980.164583] env[68492]: DEBUG nova.compute.utils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1980.165767] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1980.165933] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1980.178073] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1980.238032] env[68492]: DEBUG nova.policy [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7bf86f7359545ebbf45a5a002c88e5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '839d10b6a7894af08ca3717477bcd473', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 1980.241191] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1980.266872] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1980.267135] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1980.267297] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1980.267477] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1980.267622] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1980.267766] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1980.267974] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1980.268218] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1980.268431] env[68492]: DEBUG nova.virt.hardware [None 
req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1980.268606] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1980.268777] env[68492]: DEBUG nova.virt.hardware [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1980.269658] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d6a7ea9-6b33-407c-a884-6918e7f76343 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.277708] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbe03e99-3350-4f1b-bed2-f9b3921195a1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1980.514450] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Successfully created port: 2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1981.217513] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Successfully updated port: 2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1981.239163] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1981.239315] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1981.239459] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1981.316655] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 
tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1981.503293] env[68492]: DEBUG nova.network.neutron [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Updating instance_info_cache with network_info: [{"id": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "address": "fa:16:3e:0f:c4:2a", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2357c9d2-57", "ovs_interfaceid": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1981.517107] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1981.517401] env[68492]: DEBUG nova.compute.manager [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Instance network_info: |[{"id": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "address": "fa:16:3e:0f:c4:2a", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2357c9d2-57", "ovs_interfaceid": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1981.517999] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:c4:2a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '310b8ba9-edca-4135-863e-f4a786dd4a77', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2357c9d2-5769-41c9-ac6b-3a94de1d8412', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1981.525503] env[68492]: DEBUG oslo.service.loopingcall [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1981.526013] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1981.526247] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5f0a9acf-c741-4ab1-898a-02ee48f9e5c3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1981.547472] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1981.547472] env[68492]: value = "task-3395566" [ 1981.547472] env[68492]: _type = "Task" [ 1981.547472] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1981.555212] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395566, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1981.668738] env[68492]: DEBUG nova.compute.manager [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Received event network-vif-plugged-2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1981.668866] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Acquiring lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1981.669074] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1981.669225] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1981.669406] env[68492]: DEBUG nova.compute.manager [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] No waiting events found dispatching network-vif-plugged-2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1981.669610] env[68492]: WARNING nova.compute.manager [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Received unexpected event network-vif-plugged-2357c9d2-5769-41c9-ac6b-3a94de1d8412 for instance with vm_state building and task_state spawning. [ 1981.669737] env[68492]: DEBUG nova.compute.manager [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Received event network-changed-2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1981.669907] env[68492]: DEBUG nova.compute.manager [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Refreshing instance network info cache due to event network-changed-2357c9d2-5769-41c9-ac6b-3a94de1d8412. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 1981.670112] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Acquiring lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1981.670284] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Acquired lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1981.670463] env[68492]: DEBUG nova.network.neutron [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Refreshing network info cache for port 2357c9d2-5769-41c9-ac6b-3a94de1d8412 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1982.026396] env[68492]: DEBUG nova.network.neutron [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Updated VIF entry in instance network info cache for port 2357c9d2-5769-41c9-ac6b-3a94de1d8412. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1982.026794] env[68492]: DEBUG nova.network.neutron [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Updating instance_info_cache with network_info: [{"id": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "address": "fa:16:3e:0f:c4:2a", "network": {"id": "bd082c7d-8e55-420f-b93b-cb3b37670856", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-100048437-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "839d10b6a7894af08ca3717477bcd473", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "310b8ba9-edca-4135-863e-f4a786dd4a77", "external-id": "nsx-vlan-transportzone-768", "segmentation_id": 768, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2357c9d2-57", "ovs_interfaceid": "2357c9d2-5769-41c9-ac6b-3a94de1d8412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1982.036015] env[68492]: DEBUG oslo_concurrency.lockutils [req-0f7a5757-a2bd-4fa8-8f56-660cb973a940 req-d1865db2-1e51-4ecf-ab45-4c860835bbd6 service nova] Releasing lock "refresh_cache-75bbcae2-54ab-47d2-9bf8-b55b0881fb90" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1982.057272] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395566, 'name': CreateVM_Task, 'duration_secs': 0.27932} completed successfully. 
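The acquire/release pairs around "refresh_cache-75bbcae2-..." above are oslo.concurrency named locks serializing the per-instance refresh of the network info cache when a network-changed event arrives. A minimal sketch of the same pattern using the real lockutils.lock context manager (the cache dict and the fetch_from_neutron callable are hypothetical stand-ins for Nova's instance_info_cache and its Neutron client, not the actual API):

    from oslo_concurrency import lockutils

    _nw_cache = {}  # instance_uuid -> network_info; hypothetical in-process cache

    def refresh_nw_cache(instance_uuid, fetch_from_neutron):
        # One named lock per instance, matching the "refresh_cache-<uuid>"
        # acquire/release pairs in the log, so concurrent events cannot
        # interleave partial cache updates.
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            _nw_cache[instance_uuid] = fetch_from_neutron(instance_uuid)
        return _nw_cache[instance_uuid]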
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1982.057429] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1982.058048] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1982.058214] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1982.058521] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1982.058764] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bba47c0e-8953-4504-82df-cd1662892fff {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.062976] env[68492]: DEBUG oslo_vmware.api [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 1982.062976] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524964b8-3a05-06fc-dec8-3789b218f0e2" [ 1982.062976] env[68492]: _type = "Task" [ 1982.062976] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1982.070906] env[68492]: DEBUG oslo_vmware.api [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]524964b8-3a05-06fc-dec8-3789b218f0e2, 'name': SearchDatastore_Task} progress is 0%. 
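Each "progress is 0%" line above is one poll of a vCenter task: the driver submits a task (CreateVM_Task, SearchDatastore_Task) and blocks until it reaches a terminal state. A self-contained sketch of that submit-then-poll shape (illustrative only: get_task_info, the state strings, and TaskFailed are hypothetical stand-ins; oslo.vmware drives the equivalent loop from a looping call with richer fault handling):

    import time

    class TaskFailed(Exception):
        """Raised when the remote task reports an error state."""

    def wait_for_task(get_task_info, task_id, interval=0.5, timeout=300):
        # Poll until the task reaches a terminal state; each repeated
        # "progress is N%" entry above corresponds to one such poll.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info(task_id)      # e.g. one SOAP property read
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise TaskFailed(info.get("error"))
            time.sleep(interval)               # still queued or running
        raise TimeoutError("task %s timed out after %ss" % (task_id, timeout))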
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1982.572529] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1982.572816] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1982.573011] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9ecda1a8-408a-4316-a1a5-969d91c76c3c tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1985.558265] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "696b560c-f4ed-4105-87e9-e5380a468fe1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1985.558568] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "696b560c-f4ed-4105-87e9-e5380a468fe1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2027.476666] env[68492]: WARNING oslo_vmware.rw_handles [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2027.476666] env[68492]: ERROR oslo_vmware.rw_handles [ 2027.477539] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2027.479492] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2027.479729] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Copying Virtual Disk [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/cf1ef3e3-3473-4959-9495-c5385104d669/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2027.480041] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8b8895b0-2a60-4bd1-bfe3-431041d6e67e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2027.487336] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for the task: (returnval){ [ 2027.487336] env[68492]: value = "task-3395567" [ 2027.487336] env[68492]: _type = "Task" [ 2027.487336] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2027.495371] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Task: {'id': task-3395567, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2027.999211] env[68492]: DEBUG oslo_vmware.exceptions [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Fault InvalidArgument not matched. 
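"Fault InvalidArgument not matched" above is the fault translator failing to find a dedicated exception class for the SOAP fault name and falling back to the generic VimFaultException, which is exactly what the surrounding traceback then raises. A condensed sketch of that lookup-with-fallback (the empty registry here is a hypothetical stand-in; oslo.vmware maintains its own fault-name-to-class mapping):

    class VimFaultException(Exception):
        """Generic VIM fault carrying the fault names from the SOAP reply."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    # Hypothetical registry: a fault name with no entry is "not matched"
    # and translates to the generic exception, as logged above.
    _FAULT_CLASSES = {}

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name, VimFaultException)
        return cls([fault_name], message)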
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2027.999440] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2027.999999] env[68492]: ERROR nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2027.999999] env[68492]: Faults: ['InvalidArgument'] [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Traceback (most recent call last): [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] yield resources [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self.driver.spawn(context, instance, image_meta, [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self._fetch_image_if_missing(context, vi) [ 2027.999999] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] image_cache(vi, tmp_image_ds_loc) [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] vm_util.copy_virtual_disk( [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] session._wait_for_task(vmdk_copy_task) [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return self.wait_for_task(task_ref) [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return evt.wait() [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] result = hub.switch() [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2028.000395] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return self.greenlet.switch() [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self.f(*self.args, **self.kw) [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] raise exceptions.translate_fault(task_info.error) [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Faults: ['InvalidArgument'] [ 2028.000863] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] [ 2028.000863] env[68492]: INFO nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Terminating instance [ 2028.001920] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2028.002148] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2028.002399] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-c9678fbd-860b-4837-936e-2c6e0fc5b70e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.004529] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2028.004721] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2028.005438] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da60ea23-34cb-4e04-8040-389391266aaf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.011932] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2028.012156] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8f37b83f-0f9a-4b61-9069-d07366cb0459 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.014288] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2028.014458] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2028.015432] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eff5cb11-8eac-43ea-b5e4-4e0cb245c092 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.020131] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for the task: (returnval){ [ 2028.020131] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d70298-b908-623c-5ea9-4e69f2fe5a6b" [ 2028.020131] env[68492]: _type = "Task" [ 2028.020131] env[68492]: } to complete. 
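The "Creating directory ... Created directory" pairs above come from a create-if-missing helper: the image-cache folder is created unconditionally and an "already exists" outcome counts as success, which keeps concurrent builds from racing each other on the shared datastore path. A minimal sketch of that idempotent shape (FileAlreadyExists and the mkdir callable are stand-ins for the driver's ds_util equivalents, not the real signatures):

    class FileAlreadyExists(Exception):
        """Stand-in for the driver's 'file exists' datastore fault."""

    def create_folder_if_missing(mkdir, path):
        # Create unconditionally; treat "already exists" as success so the
        # call is safe to repeat from concurrent build requests.
        try:
            mkdir(path)
        except FileAlreadyExists:
            pass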
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2028.029302] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52d70298-b908-623c-5ea9-4e69f2fe5a6b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2028.078020] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2028.078269] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2028.078505] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Deleting the datastore file [datastore2] 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2028.078842] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b5c2df85-2108-49b0-88c4-5e4454df032f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.084617] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for the task: (returnval){ [ 2028.084617] env[68492]: value = "task-3395569" [ 2028.084617] env[68492]: _type = "Task" [ 2028.084617] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2028.091919] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Task: {'id': task-3395569, 'name': DeleteDatastoreFile_Task} progress is 0%. 
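The teardown above runs in a deliberate order: UnregisterVM first, so vCenter releases its hold on the VM's files, then DeleteDatastoreFile_Task to remove the backing directory, with a task wait in between. Sketched with a hypothetical session wrapper (not the actual nova.virt.vmwareapi call signatures):

    def destroy_vm(session, vm_ref, ds_path):
        # Unregister first so vCenter drops its file locks, then delete the
        # backing files and wait on the returned task (cf. the
        # "Unregistered the VM" / "Deleting the datastore file" entries).
        session.call("UnregisterVM", vm_ref)              # hypothetical wrapper
        task = session.call("DeleteDatastoreFile_Task", ds_path)
        session.wait_for_task(task)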
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2028.230731] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2028.530453] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2028.530814] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Creating directory with path [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2028.530964] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a198b528-e03a-4be2-9c2c-45a29cb59c3b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.541819] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Created directory with path [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2028.541996] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Fetch image to [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2028.542181] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2028.542884] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76a64cca-7ba4-4f9d-86fe-358a2a1a6a37 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.549094] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6cbf722-a075-40e9-9dc4-4e829f9aedbd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.557873] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9bf9c5d0-1d66-4b1b-b09e-ee4e54d1a07c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.590033] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9795a76f-a271-4bcd-93a5-ab50b63f7b8b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.596715] env[68492]: DEBUG oslo_vmware.api [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Task: {'id': task-3395569, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07384} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2028.598122] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2028.598312] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2028.598482] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2028.598651] env[68492]: INFO nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Took 0.59 seconds to destroy the instance on the hypervisor. 
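With the half-built VM destroyed, the entries that follow abort the resource claim under the "compute_resources" lock so the tracker's inventory matches reality again. A toy claim-as-context-manager showing that accounting (the figures echo the inventory logged below; the real ResourceTracker is far richer):

    import threading

    class Claim:
        """Toy claim against a shared inventory; illustrative numbers only."""
        _lock = threading.Lock()
        inventory = {"VCPU": 48, "MEMORY_MB": 196590}

        def __init__(self, **ask):
            self.ask = ask

        def __enter__(self):
            with self._lock:              # cf. the "compute_resources" lock
                for rc, amount in self.ask.items():
                    if self.inventory[rc] < amount:
                        raise RuntimeError("insufficient %s" % rc)
                for rc, amount in self.ask.items():
                    self.inventory[rc] -= amount
            return self

        def __exit__(self, exc_type, exc, tb):
            if exc_type is not None:      # spawn failed: give it all back
                with self._lock:
                    for rc, amount in self.ask.items():
                        self.inventory[rc] += amount

Used as "with Claim(VCPU=1, MEMORY_MB=128): spawn()", an exception raised inside the block returns the resources, which is the abort_instance_claim path recorded here.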
[ 2028.600616] env[68492]: DEBUG nova.compute.claims [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2028.600783] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2028.600995] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2028.603411] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2977903b-5f00-4028-a4de-444c537d3ac2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.626071] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2028.753371] env[68492]: DEBUG oslo_vmware.rw_handles [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2028.816931] env[68492]: DEBUG oslo_vmware.rw_handles [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2028.817104] env[68492]: DEBUG oslo_vmware.rw_handles [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
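The write-handle lines above bracket a plain HTTP upload of the 21318656-byte image to the ESX host's /folder endpoint. A minimal stdlib sketch of that lifecycle (illustrative, not oslo.vmware's rw_handles): the response is only read when the handle closes, which is why a server that hangs up early surfaces as http.client.RemoteDisconnected at close time, as in the earlier traceback.

    import http.client

    def upload_to_datastore(host, path, chunks, size):
        # Open a connection, stream the image bytes, then read the response
        # while closing; a premature server hang-up makes getresponse()
        # raise http.client.RemoteDisconnected.
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest("PUT", path)
        conn.putheader("Content-Length", str(size))
        conn.endheaders()
        try:
            for chunk in chunks:
                conn.send(chunk)
            resp = conn.getresponse()
            resp.read()
            return resp.status
        finally:
            conn.close()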
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2028.846165] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-182ad192-a9e7-4fb5-8b9d-0293341e9ab6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.853930] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a66c44fb-ad50-423c-a969-aa1e44f0f045 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.883101] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-354f5801-7e5c-46ff-87be-3a179da2665b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.889910] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77bcd2e6-5ac1-4e2d-a634-580c7423d56b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2028.902558] env[68492]: DEBUG nova.compute.provider_tree [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2028.911932] env[68492]: DEBUG nova.scheduler.client.report [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2028.929137] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2028.929668] env[68492]: ERROR nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2028.929668] env[68492]: Faults: ['InvalidArgument'] [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Traceback (most recent call last): [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2028.929668] 
env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self.driver.spawn(context, instance, image_meta, [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self._fetch_image_if_missing(context, vi) [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] image_cache(vi, tmp_image_ds_loc) [ 2028.929668] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] vm_util.copy_virtual_disk( [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] session._wait_for_task(vmdk_copy_task) [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return self.wait_for_task(task_ref) [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return evt.wait() [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] result = hub.switch() [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] return self.greenlet.switch() [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2028.930049] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] self.f(*self.args, **self.kw) [ 2028.930421] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2028.930421] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] raise exceptions.translate_fault(task_info.error) [ 2028.930421] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2028.930421] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Faults: ['InvalidArgument'] [ 2028.930421] env[68492]: ERROR nova.compute.manager [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] [ 2028.930421] env[68492]: DEBUG nova.compute.utils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2028.931814] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Build of instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea was re-scheduled: A specified parameter was not correct: fileType [ 2028.931814] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2028.932221] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2028.932386] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2028.932555] env[68492]: DEBUG nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2028.932720] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2029.219257] env[68492]: DEBUG nova.network.neutron [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2029.233711] env[68492]: INFO nova.compute.manager [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Took 0.30 seconds to deallocate network for instance. [ 2029.350470] env[68492]: INFO nova.scheduler.client.report [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Deleted allocations for instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea [ 2029.372453] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a31b292f-226f-4bf6-a8c3-8d2d1b98f173 tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 653.294s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.373717] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 455.742s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.373862] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Acquiring lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2029.374068] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.374228] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.376150] env[68492]: INFO nova.compute.manager [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Terminating instance [ 2029.377740] env[68492]: DEBUG nova.compute.manager [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2029.377932] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2029.378394] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4c7a9301-f61b-4bd9-92f6-700cce763332 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.388478] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f7d0006-2a4f-4143-b999-a5bc293d1d9c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.399288] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2029.419076] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea could not be found. 
[ 2029.419335] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2029.419440] env[68492]: INFO nova.compute.manager [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2029.419675] env[68492]: DEBUG oslo.service.loopingcall [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2029.419913] env[68492]: DEBUG nova.compute.manager [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2029.420107] env[68492]: DEBUG nova.network.neutron [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2029.445542] env[68492]: DEBUG nova.network.neutron [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2029.449957] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2029.450596] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.451752] env[68492]: INFO nova.compute.claims [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2029.455424] env[68492]: INFO nova.compute.manager [-] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] Took 0.04 seconds to deallocate network for instance. 
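The second terminate above finds no backend VM, logs the InstanceNotFound warning, and still walks the rest of the teardown; the network deallocation is additionally wrapped in a looping call so it can be retried. Both behaviours in one condensed sketch (a plain-Python retry loop standing in for the oslo.service looping call; the callables are hypothetical):

    import time

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def terminate(uuid, lookup_vm, delete_vm, deallocate,
                  attempts=3, delay=1.0):
        # Destroy must be idempotent: a VM already cleaned up by the failed
        # build is logged and skipped, not treated as an error...
        try:
            delete_vm(lookup_vm(uuid))
        except InstanceNotFound:
            print("Instance %s does not exist on backend: continuing" % uuid)
        # ...and network teardown is retried a bounded number of times,
        # re-raising the last failure once attempts are exhausted.
        for attempt in range(1, attempts + 1):
            try:
                return deallocate(uuid)
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(delay)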
[ 2029.540955] env[68492]: DEBUG oslo_concurrency.lockutils [None req-741d8660-7692-4601-b5b1-c5947ec0f70e tempest-ListServerFiltersTestJSON-822025726 tempest-ListServerFiltersTestJSON-822025726-project-member] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.167s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.542040] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 261.295s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.542040] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 66fcb02a-4f71-4adc-b73c-050f0b0eb0ea] During sync_power_state the instance has a pending task (deleting). Skip. [ 2029.542243] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "66fcb02a-4f71-4adc-b73c-050f0b0eb0ea" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.631172] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51650e57-35f1-4d5d-8b5a-528599adcdcd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.639089] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1a4c26-c655-4d81-8185-f5e9c22aaa18 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.669525] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-855ba12f-30cb-4c1a-b388-3eaefd69381b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.676474] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ab7a5b-1240-4e91-a5b0-11d6b8586faf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.689411] env[68492]: DEBUG nova.compute.provider_tree [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2029.698074] env[68492]: DEBUG nova.scheduler.client.report [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2029.712405] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.712839] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2029.753372] env[68492]: DEBUG nova.compute.utils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2029.754530] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2029.754701] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2029.762840] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2029.818651] env[68492]: DEBUG nova.policy [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '568ab24cbb7d4833bb8cdfd51db89db5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80fa34aee50b4509a18abca39075924a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 2029.823232] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Start spawning the instance on the hypervisor. 
{{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2029.848732] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2029.848981] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2029.849154] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2029.849329] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2029.849472] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2029.849615] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2029.849817] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2029.849972] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2029.850150] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2029.850458] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2029.850545] env[68492]: DEBUG nova.virt.hardware [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2029.851373] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26587148-54f0-457d-af27-7543ba07e415 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.859807] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be593304-03ab-402c-9930-a3d5427bc01a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2030.183350] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Successfully created port: d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2031.083496] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Successfully updated port: d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2031.096733] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2031.096931] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2031.097050] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2031.151480] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2031.231514] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2031.231723] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2031.231865] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2031.253507] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.253681] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.253816] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.253949] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254090] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254213] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254390] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254542] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254726] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254851] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2031.254968] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2031.259713] env[68492]: DEBUG nova.compute.manager [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Received event network-vif-plugged-d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2031.259923] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Acquiring lock "696b560c-f4ed-4105-87e9-e5380a468fe1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2031.260138] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Lock "696b560c-f4ed-4105-87e9-e5380a468fe1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2031.260298] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Lock "696b560c-f4ed-4105-87e9-e5380a468fe1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2031.260461] env[68492]: DEBUG nova.compute.manager [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] No waiting events found dispatching network-vif-plugged-d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2031.260760] env[68492]: WARNING nova.compute.manager [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Received unexpected event network-vif-plugged-d369ab38-80e4-4adb-b30d-c470303ee25b for instance with vm_state building and task_state spawning. 
[ 2031.261010] env[68492]: DEBUG nova.compute.manager [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Received event network-changed-d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2031.261252] env[68492]: DEBUG nova.compute.manager [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Refreshing instance network info cache due to event network-changed-d369ab38-80e4-4adb-b30d-c470303ee25b. {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2031.261427] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Acquiring lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2031.346313] env[68492]: DEBUG nova.network.neutron [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Updating instance_info_cache with network_info: [{"id": "d369ab38-80e4-4adb-b30d-c470303ee25b", "address": "fa:16:3e:39:0a:f0", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd369ab38-80", "ovs_interfaceid": "d369ab38-80e4-4adb-b30d-c470303ee25b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2031.358199] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2031.358481] env[68492]: DEBUG nova.compute.manager [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Instance network_info: |[{"id": "d369ab38-80e4-4adb-b30d-c470303ee25b", "address": "fa:16:3e:39:0a:f0", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd369ab38-80", "ovs_interfaceid": "d369ab38-80e4-4adb-b30d-c470303ee25b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2031.358811] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Acquired lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2031.359045] env[68492]: DEBUG nova.network.neutron [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Refreshing network info cache for port d369ab38-80e4-4adb-b30d-c470303ee25b {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2031.360373] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:39:0a:f0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35342bcb-8b06-472e-b3c0-43fd3d6c4b30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd369ab38-80e4-4adb-b30d-c470303ee25b', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2031.367903] env[68492]: DEBUG oslo.service.loopingcall [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2031.368956] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2031.371739] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38f972b5-8798-4d0c-bf3b-e5ae4cdb3e51 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2031.393380] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2031.393380] env[68492]: value = "task-3395570" [ 2031.393380] env[68492]: _type = "Task" [ 2031.393380] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2031.401350] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395570, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2031.612668] env[68492]: DEBUG nova.network.neutron [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Updated VIF entry in instance network info cache for port d369ab38-80e4-4adb-b30d-c470303ee25b. {{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2031.613043] env[68492]: DEBUG nova.network.neutron [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Updating instance_info_cache with network_info: [{"id": "d369ab38-80e4-4adb-b30d-c470303ee25b", "address": "fa:16:3e:39:0a:f0", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd369ab38-80", "ovs_interfaceid": "d369ab38-80e4-4adb-b30d-c470303ee25b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2031.622049] env[68492]: DEBUG oslo_concurrency.lockutils [req-e7ca540c-5fd4-427c-b12f-4369a6870c74 req-b406bca9-c4c7-41e0-a5ce-b11d50e22d25 service nova] Releasing lock "refresh_cache-696b560c-f4ed-4105-87e9-e5380a468fe1" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2031.903464] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395570, 'name': CreateVM_Task, 'duration_secs': 0.351075} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2031.903678] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2031.904304] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2031.904471] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2031.904784] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2031.905043] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-901ebab2-91e9-412b-a71d-d5f13ae9ed82 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2031.909212] env[68492]: DEBUG oslo_vmware.api [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 2031.909212] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5225dbed-3d35-0760-3e1c-dd7d94f078ba" [ 2031.909212] env[68492]: _type = "Task" [ 2031.909212] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2031.916487] env[68492]: DEBUG oslo_vmware.api [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5225dbed-3d35-0760-3e1c-dd7d94f078ba, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2032.230957] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2032.419674] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2032.419914] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2032.420140] env[68492]: DEBUG oslo_concurrency.lockutils [None req-5674ad87-c4a0-42f9-96d6-57ee42a8a323 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2034.210048] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2034.210444] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2034.231025] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2034.231251] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2034.243560] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2034.243756] env[68492]: DEBUG oslo_concurrency.lockutils 
[None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2034.243916] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2034.244090] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2034.245174] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc50647d-1780-4de7-a4db-ca681a4bd1c2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.254284] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e95fe3a-b805-4d17-bfc2-2804f61e96b3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.268174] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-134aa976-a837-441b-9f9d-b11e2ab06668 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.275422] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae59f9fe-af0c-4e56-ac61-ac8d6b43de64 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.306616] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2034.306859] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2034.307132] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2034.375985] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376177] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376306] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376434] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376542] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376655] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376768] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376880] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.376991] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.377120] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2034.389237] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 2034.389474] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2034.389702] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2034.405352] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2034.419168] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2034.419377] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2034.430364] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None 
{{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2034.446186] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2034.573158] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e94290-9527-4254-97de-56665dccdcb9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.581594] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db6e4613-22d7-41d3-bff1-8932fd33ef4b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.611262] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14423436-adc4-4293-bc0f-c96cf9e63a33 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.619179] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c5f144c-377f-4f15-b700-1fbaadd44524 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2034.632615] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2034.641558] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2034.656684] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2034.656856] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2035.657323] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.657615] env[68492]: 
DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2036.231461] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.231714] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2037.226466] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.230996] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.231342] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 2041.239446] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.239446] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 2041.250138] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] There are 0 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 2042.231087] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2051.713502] env[68492]: DEBUG oslo_concurrency.lockutils [None req-6295c49f-4282-4ce2-8aee-dd1bbf43e9ad tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2051.777764] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "c472a34d-b388-46c9-a7e0-7106b0666478" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2077.835492] env[68492]: WARNING oslo_vmware.rw_handles [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2077.835492] env[68492]: ERROR oslo_vmware.rw_handles [ 2077.836116] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2077.838080] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2077.838344] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Copying Virtual Disk [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/089323e1-90b5-4e16-ba57-09341131a203/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2077.838641] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-25b82eea-9ba7-42a0-98e0-accbf4c282d4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2077.847314] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 
tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for the task: (returnval){ [ 2077.847314] env[68492]: value = "task-3395571" [ 2077.847314] env[68492]: _type = "Task" [ 2077.847314] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2077.854897] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Task: {'id': task-3395571, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2078.359023] env[68492]: DEBUG oslo_vmware.exceptions [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2078.359023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2078.359186] env[68492]: ERROR nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2078.359186] env[68492]: Faults: ['InvalidArgument'] [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Traceback (most recent call last): [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] yield resources [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self.driver.spawn(context, instance, image_meta, [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self._fetch_image_if_missing(context, vi) [ 2078.359186] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] image_cache(vi, tmp_image_ds_loc) [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] vm_util.copy_virtual_disk( [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] session._wait_for_task(vmdk_copy_task) [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return self.wait_for_task(task_ref) [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return evt.wait() [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] result = hub.switch() [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2078.359467] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return self.greenlet.switch() [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self.f(*self.args, **self.kw) [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] raise exceptions.translate_fault(task_info.error) [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Faults: ['InvalidArgument'] [ 2078.359746] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] [ 2078.359746] env[68492]: INFO nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Terminating instance [ 2078.361057] env[68492]: 
DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2078.361196] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2078.361505] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aab01ed1-2fe4-4c7b-b716-0d141db67a83 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.363725] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2078.363916] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2078.364677] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bdbb225-d8ff-4c32-afee-c19dc68ab0cb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.373252] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2078.373474] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c16b4caf-3727-40b7-8a94-a9cc11346fa2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.375595] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2078.375768] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2078.376726] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17c3a79e-178f-4c2a-b484-a1588ee9392d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.381355] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for the task: (returnval){ [ 2078.381355] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525da1d3-062e-23ed-1bd5-2bd24b5d8fee" [ 2078.381355] env[68492]: _type = "Task" [ 2078.381355] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2078.388261] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525da1d3-062e-23ed-1bd5-2bd24b5d8fee, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2078.440583] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2078.440826] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2078.441029] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Deleting the datastore file [datastore2] 18e27433-5b1f-4ae8-8bfc-a232966de70b {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2078.441300] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-63c2a2e7-25ad-4c45-a224-f951f2670193 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.447301] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for the task: (returnval){ [ 2078.447301] env[68492]: value = "task-3395573" [ 2078.447301] env[68492]: _type = "Task" [ 2078.447301] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2078.454822] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Task: {'id': task-3395573, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2078.891751] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2078.892091] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Creating directory with path [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2078.892254] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e3c2a92-b50f-42f6-85b5-3740b045f153 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.903167] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Created directory with path [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2078.903352] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Fetch image to [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2078.903520] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2078.904247] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5d3dc40-bd9e-403f-818b-53e5f10a5211 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.910502] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd14e994-b56c-489a-84db-0d66544470af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.919226] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec9a86de-3196-4ad0-9f67-c486f90019ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.951124] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-614d8717-5453-4896-b6c8-d26a58a3e160 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.957733] env[68492]: DEBUG oslo_vmware.api [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Task: {'id': task-3395573, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063285} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2078.959166] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2078.959425] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2078.959677] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2078.959919] env[68492]: INFO nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Took 0.60 seconds to destroy the instance on the hypervisor. 
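The failed CopyVirtualDisk_Task above follows the generic task-polling contract visible in the traceback frames (wait_for_task -> _poll_task -> exceptions.translate_fault(task_info.error)): the driver schedules a vCenter task, polls its state, and a terminal error is translated into a VimFaultException, which is how the InvalidArgument/fileType fault surfaces in nova.compute.manager. A minimal poll-and-raise sketch of that pattern, in illustrative Python with stand-in names — not oslo.vmware's actual implementation:

    import time

    class FakeVimFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(poll, interval=0.5):
        """poll() -> (state, payload), state in {'running','success','error'}.

        Loops until the task leaves the running state; an 'error' state is
        turned into an exception, mirroring translate_fault() in the trace.
        """
        while True:
            state, payload = poll()
            if state == 'success':
                return payload
            if state == 'error':
                raise FakeVimFault(payload)
            time.sleep(interval)

    if __name__ == '__main__':
        # Simulate one 'progress is 0%' poll followed by the fault seen above.
        states = iter([('running', '0%'),
                       ('error', 'A specified parameter was not correct: fileType')])
        try:
            wait_for_task(lambda: next(states), interval=0)
        except FakeVimFault as exc:
            print('task failed:', exc)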
[ 2078.961746] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2316d909-bd7d-4fe9-a8ae-afcec1813739 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2078.963581] env[68492]: DEBUG nova.compute.claims [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2078.963749] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2078.963956] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.983789] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2079.108045] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2079.166678] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2079.166866] env[68492]: DEBUG oslo_vmware.rw_handles [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2079.198216] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5436c135-3ad6-48d0-87e9-f13f93b15425 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.206992] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d33ade36-d042-441d-9566-7f812256f61b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.235483] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71b2cd65-895c-4aad-bf61-6333f5796244 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.242400] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68471f19-c8fc-4bd5-ac06-d58e2743766f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.254974] env[68492]: DEBUG nova.compute.provider_tree [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2079.263753] env[68492]: DEBUG nova.scheduler.client.report [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2079.278111] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2079.278644] env[68492]: ERROR nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2079.278644] env[68492]: Faults: ['InvalidArgument'] [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Traceback (most recent call last): [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2079.278644] 
env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self.driver.spawn(context, instance, image_meta, [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self._fetch_image_if_missing(context, vi) [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] image_cache(vi, tmp_image_ds_loc) [ 2079.278644] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] vm_util.copy_virtual_disk( [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] session._wait_for_task(vmdk_copy_task) [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return self.wait_for_task(task_ref) [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return evt.wait() [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] result = hub.switch() [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] return self.greenlet.switch() [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2079.279068] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] self.f(*self.args, **self.kw) [ 2079.279341] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2079.279341] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] raise exceptions.translate_fault(task_info.error) [ 2079.279341] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2079.279341] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Faults: ['InvalidArgument'] [ 2079.279341] env[68492]: ERROR nova.compute.manager [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] [ 2079.279341] env[68492]: DEBUG nova.compute.utils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2079.280697] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Build of instance 18e27433-5b1f-4ae8-8bfc-a232966de70b was re-scheduled: A specified parameter was not correct: fileType [ 2079.280697] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2079.281071] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2079.281241] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2079.281440] env[68492]: DEBUG nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2079.281609] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2079.612226] env[68492]: DEBUG nova.network.neutron [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2079.623451] env[68492]: INFO nova.compute.manager [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Took 0.34 seconds to deallocate network for instance. [ 2079.718369] env[68492]: INFO nova.scheduler.client.report [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Deleted allocations for instance 18e27433-5b1f-4ae8-8bfc-a232966de70b [ 2079.738023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-397e4800-685f-48df-9bf6-fdaa69f8fac9 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 687.849s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2079.739134] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 491.138s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2079.739623] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2079.739623] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2079.739765] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2079.742266] env[68492]: INFO nova.compute.manager [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Terminating instance [ 2079.744284] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquiring lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2079.744284] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Acquired lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2079.744510] env[68492]: DEBUG nova.network.neutron [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2079.752252] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2079.779296] env[68492]: DEBUG nova.network.neutron [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2079.805795] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2079.806063] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2079.807612] env[68492]: INFO nova.compute.claims [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2079.981954] env[68492]: DEBUG nova.network.neutron [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2079.989657] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e09425-c2c9-4c21-8914-23e5c87445e3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2079.993794] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Releasing lock "refresh_cache-18e27433-5b1f-4ae8-8bfc-a232966de70b" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2079.994287] env[68492]: DEBUG nova.compute.manager [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2079.994508] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2079.995369] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-972a4e71-97bb-433e-950d-b5d862617a55 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.000711] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62fb0aa1-2abc-41c4-a9ce-8d73067e7a27 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.006683] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5076e420-1abe-4af9-8386-c710a38b40cf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.053079] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6e90230-ebab-4460-9edb-9e7dfa1fbf62 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.055712] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 18e27433-5b1f-4ae8-8bfc-a232966de70b could not be found. [ 2080.055904] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2080.056088] env[68492]: INFO nova.compute.manager [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Took 0.06 seconds to destroy the instance on the hypervisor. [ 2080.056339] env[68492]: DEBUG oslo.service.loopingcall [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2080.056523] env[68492]: DEBUG nova.compute.manager [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2080.056614] env[68492]: DEBUG nova.network.neutron [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2080.062877] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1512dd-38cf-4ac9-8d84-59c4167ce904 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.076960] env[68492]: DEBUG nova.compute.provider_tree [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2080.078411] env[68492]: DEBUG nova.network.neutron [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2080.085355] env[68492]: DEBUG nova.network.neutron [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2080.086637] env[68492]: DEBUG nova.scheduler.client.report [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2080.092298] env[68492]: INFO nova.compute.manager [-] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] Took 0.04 seconds to deallocate network for instance. [ 2080.098064] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2080.098488] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2080.128634] env[68492]: DEBUG nova.compute.utils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2080.129867] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2080.130049] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2080.139951] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2080.177159] env[68492]: DEBUG oslo_concurrency.lockutils [None req-684b5b35-d229-476b-84ec-79810dc89734 tempest-ServerDiskConfigTestJSON-1495871571 tempest-ServerDiskConfigTestJSON-1495871571-project-member] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.438s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2080.178171] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 311.931s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2080.178362] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 18e27433-5b1f-4ae8-8bfc-a232966de70b] During sync_power_state the instance has a pending task (spawning). Skip. 
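The oslo_concurrency.lockutils lines throughout this trace record a named-lock lifecycle with timing: "acquiring", "acquired ... waited Ns", "released ... held Ns" (e.g. the build lock above held 687.849s across the re-schedule). A hedged sketch of that bookkeeping, assuming a simple per-name threading.Lock plus wait/hold timers — illustrative only, not oslo.concurrency's real internals:

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}                 # one lock object per lock name
    _guard = threading.Lock()   # protects the _locks registry itself

    @contextmanager
    def named_lock(name, by):
        with _guard:
            lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{by}" :: waited {waited:.3f}s')
        try:
            yield
        finally:
            held = time.monotonic() - t0 - waited
            lock.release()
            print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')

    if __name__ == '__main__':
        with named_lock('compute_resources', 'demo.instance_claim'):
            time.sleep(0.01)   # critical section

Under this model, a long "waited" value means contention on the name (another request held it), while a long "held" value means the critical section itself was slow — the two numbers the log distinguishes.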
[ 2080.178537] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "18e27433-5b1f-4ae8-8bfc-a232966de70b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2080.185644] env[68492]: DEBUG nova.policy [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06d98ba654414d2091d24b5304834776', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bbfde028d2494faca2e128b80c7c6a0d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 2080.206216] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2080.230278] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2080.230507] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2080.230770] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2080.231142] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 
2080.231375] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2080.231557] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2080.231784] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2080.231981] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2080.232235] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2080.232452] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2080.232626] env[68492]: DEBUG nova.virt.hardware [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2080.233482] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fefa8878-1c1c-40cd-99ee-b22caaab383c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.242156] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-816eff6c-d5ba-4445-906a-b204fce5579e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2080.503721] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Successfully created port: 972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2081.252329] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 
tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Successfully updated port: 972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2081.263847] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2081.264023] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2081.264177] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2081.504777] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Instance cache missing network info. {{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2081.660504] env[68492]: DEBUG nova.network.neutron [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Updating instance_info_cache with network_info: [{"id": "972d6254-e873-4953-8a51-df6827eb5633", "address": "fa:16:3e:24:77:1f", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap972d6254-e8", "ovs_interfaceid": "972d6254-e873-4953-8a51-df6827eb5633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2081.666051] env[68492]: DEBUG nova.compute.manager [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 
62a40c52-fae7-4025-b0af-1c2124e4d6f5] Received event network-vif-plugged-972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2081.666272] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Acquiring lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2081.666468] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2081.666628] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2081.666792] env[68492]: DEBUG nova.compute.manager [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] No waiting events found dispatching network-vif-plugged-972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2081.666949] env[68492]: WARNING nova.compute.manager [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Received unexpected event network-vif-plugged-972d6254-e873-4953-8a51-df6827eb5633 for instance with vm_state building and task_state spawning. [ 2081.667118] env[68492]: DEBUG nova.compute.manager [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Received event network-changed-972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2081.667268] env[68492]: DEBUG nova.compute.manager [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Refreshing instance network info cache due to event network-changed-972d6254-e873-4953-8a51-df6827eb5633. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2081.667492] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Acquiring lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2081.676455] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2081.676729] env[68492]: DEBUG nova.compute.manager [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Instance network_info: |[{"id": "972d6254-e873-4953-8a51-df6827eb5633", "address": "fa:16:3e:24:77:1f", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap972d6254-e8", "ovs_interfaceid": "972d6254-e873-4953-8a51-df6827eb5633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2081.676991] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Acquired lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2081.677177] env[68492]: DEBUG nova.network.neutron [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Refreshing network info cache for port 972d6254-e873-4953-8a51-df6827eb5633 {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2081.678321] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:77:1f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cd098b1c-636f-492d-b5ae-037cb0cae454', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 
'972d6254-e873-4953-8a51-df6827eb5633', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2081.685771] env[68492]: DEBUG oslo.service.loopingcall [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2081.688683] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2081.689119] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b7101bd-e7f9-4bfb-a1fa-a6bfbfc0726e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2081.709831] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2081.709831] env[68492]: value = "task-3395574" [ 2081.709831] env[68492]: _type = "Task" [ 2081.709831] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2081.718432] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395574, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2081.965871] env[68492]: DEBUG nova.network.neutron [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Updated VIF entry in instance network info cache for port 972d6254-e873-4953-8a51-df6827eb5633. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2081.966236] env[68492]: DEBUG nova.network.neutron [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Updating instance_info_cache with network_info: [{"id": "972d6254-e873-4953-8a51-df6827eb5633", "address": "fa:16:3e:24:77:1f", "network": {"id": "e36b4b9c-574b-4864-99e8-f1821399aff5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-409731655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bbfde028d2494faca2e128b80c7c6a0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cd098b1c-636f-492d-b5ae-037cb0cae454", "external-id": "nsx-vlan-transportzone-377", "segmentation_id": 377, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap972d6254-e8", "ovs_interfaceid": "972d6254-e873-4953-8a51-df6827eb5633", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2081.975857] env[68492]: DEBUG oslo_concurrency.lockutils [req-6f2a7d1b-8601-457a-9034-d16e4706ee15 req-2cee7835-0ab9-47d0-9dd3-7e9ca47b3913 service nova] Releasing lock "refresh_cache-62a40c52-fae7-4025-b0af-1c2124e4d6f5" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2082.220282] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395574, 'name': CreateVM_Task, 'duration_secs': 0.276589} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2082.220416] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2082.220974] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2082.221160] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2082.221496] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2082.221749] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0b142e24-0d68-485f-acfa-9bca04d9e0f8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2082.225910] env[68492]: DEBUG oslo_vmware.api [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 2082.225910] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52fc23f4-766b-138e-decc-f0fedecdc0ca" [ 2082.225910] env[68492]: _type = "Task" [ 2082.225910] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2082.233072] env[68492]: DEBUG oslo_vmware.api [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52fc23f4-766b-138e-decc-f0fedecdc0ca, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2082.736487] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2082.736829] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2082.736950] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c5eacaf4-aceb-4d1b-89a9-84d815345eb0 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2090.239617] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2092.231664] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2092.231989] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2092.231989] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2092.253813] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.253958] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254099] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254224] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254347] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254466] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254585] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254704] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254819] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.254933] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2092.255060] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2092.255520] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2094.231663] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.231306] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.243796] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2095.244090] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2095.244221] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2095.244371] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2095.245499] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d254933d-dcfd-4dde-8d02-9585fcf6e628 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.254211] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea31e9a-515e-40d2-81ef-a72ec19bde25 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.267955] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86cc7ec-5c01-4e1d-822a-a4e0cea619de {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.274046] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-372b068e-51e1-49fa-ae15-ea24eaade6f6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.303185] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180940MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2095.303330] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2095.303515] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2095.375257] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a90e989d-6aef-482f-b767-8dbdd7f29628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.375444] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.375597] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.375723] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.375844] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.375962] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.376093] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.376210] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.376324] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.376438] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2095.376630] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2095.376766] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2095.490296] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbf2d5f7-5ad9-4824-9012-f3df7fe03be4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.497945] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7604d9b3-d85b-4cf0-a0a6-e2771eb2da1c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.527026] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff4f0700-4c28-4285-8e5e-43d5c5b95e4b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.533699] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28e287f-78f9-4a5e-87f3-cc53c9805e34 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2095.546119] env[68492]: DEBUG 
nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2095.554044] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2095.578037] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2095.578238] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2096.578447] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.578794] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2096.578794] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2097.226564] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.247052] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2099.246457] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2127.855113] env[68492]: WARNING oslo_vmware.rw_handles [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2127.855113] env[68492]: ERROR oslo_vmware.rw_handles [ 2127.855690] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2127.857824] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2127.857926] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 
tempest-ServersTestFqdnHostnames-688777174-project-member] Copying Virtual Disk [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/0b37dbfc-7de4-4e28-b1e8-a81a1e71eb58/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2127.858413] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d91c49d2-f429-49fe-af39-1de8ea18a01f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2127.866619] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for the task: (returnval){ [ 2127.866619] env[68492]: value = "task-3395575" [ 2127.866619] env[68492]: _type = "Task" [ 2127.866619] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2127.874469] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Task: {'id': task-3395575, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2128.377482] env[68492]: DEBUG oslo_vmware.exceptions [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2128.377772] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2128.378370] env[68492]: ERROR nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2128.378370] env[68492]: Faults: ['InvalidArgument'] [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Traceback (most recent call last): [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] yield resources [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self.driver.spawn(context, instance, image_meta, [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self._fetch_image_if_missing(context, vi) [ 2128.378370] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] image_cache(vi, tmp_image_ds_loc) [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] vm_util.copy_virtual_disk( [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] session._wait_for_task(vmdk_copy_task) [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return self.wait_for_task(task_ref) [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return evt.wait() [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] result = hub.switch() [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2128.378662] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return self.greenlet.switch() [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self.f(*self.args, **self.kw) [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] raise exceptions.translate_fault(task_info.error) [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Faults: ['InvalidArgument'] [ 2128.378978] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] [ 2128.378978] env[68492]: INFO nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Terminating instance [ 2128.380334] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2128.380549] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2128.380790] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-300464d3-8746-4412-9a04-ebe469f3d1cf 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.383235] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2128.383443] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2128.384184] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1744e07-6b4c-4d40-bcfc-25b6f7c59717 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.390978] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2128.392017] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b8a70d75-0e6d-475b-a9b4-e3dcc2175e67 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.393414] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2128.393597] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2128.394273] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd4cc044-4604-493d-8036-409be4a2c9a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.399145] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 2128.399145] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525b0fc1-db72-c695-68d7-262278c42b7d" [ 2128.399145] env[68492]: _type = "Task" [ 2128.399145] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2128.405920] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]525b0fc1-db72-c695-68d7-262278c42b7d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2128.909939] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2128.910309] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating directory with path [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2128.910517] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc94d13c-d91b-4225-85fc-5b82f4f2ec10 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.929992] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Created directory with path [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2128.930231] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Fetch image to [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2128.930374] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2128.931119] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2315e5a-84c1-431c-bdaf-389f9163f0af {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.937668] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da8a2420-f9b1-488e-99f9-91dd37588fbf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2128.946442] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51ae43d1-9105-47f5-8811-3636c3400011 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.977626] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d952bd3-1a67-4aa5-97fb-2e5badd9d532 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2128.983478] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4a7810e0-7a94-4171-859e-950573a6f26d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2129.006734] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2129.054596] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2129.113989] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2129.114133] env[68492]: DEBUG oslo_vmware.rw_handles [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2129.696871] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2129.697111] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2129.697294] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Deleting the datastore file [datastore2] a90e989d-6aef-482f-b767-8dbdd7f29628 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2129.697575] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e0fd8128-07a9-48a8-ac4e-fc5f10962b67 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2129.703360] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for the task: (returnval){ [ 2129.703360] env[68492]: value = "task-3395577" [ 2129.703360] env[68492]: _type = "Task" [ 2129.703360] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2129.711197] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Task: {'id': task-3395577, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2130.214449] env[68492]: DEBUG oslo_vmware.api [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Task: {'id': task-3395577, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.09183} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2130.214449] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2130.214449] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2130.214449] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2130.214449] env[68492]: INFO nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Took 1.83 seconds to destroy the instance on the hypervisor. [ 2130.216398] env[68492]: DEBUG nova.compute.claims [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2130.216398] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2130.216546] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2130.375048] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b5248b7-b4b9-4301-98c0-57763abab0f3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.382321] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cd1a6b5-16e5-475a-abb7-39c80ed09750 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.411310] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dca4fee9-43ed-44db-a499-0680fc67b9e7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.417889] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b21a1bd1-ed89-44db-9637-f39b1e9e8e69 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.430451] env[68492]: DEBUG nova.compute.provider_tree [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2130.438511] env[68492]: DEBUG nova.scheduler.client.report [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2130.451442] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.235s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2130.451949] env[68492]: ERROR nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2130.451949] env[68492]: Faults: ['InvalidArgument'] [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Traceback (most recent call last): [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self.driver.spawn(context, instance, image_meta, [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self._fetch_image_if_missing(context, vi) [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] image_cache(vi, tmp_image_ds_loc) [ 2130.451949] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] vm_util.copy_virtual_disk( [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] session._wait_for_task(vmdk_copy_task) [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return self.wait_for_task(task_ref) [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return evt.wait() [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] result = hub.switch() [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] return self.greenlet.switch() [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2130.452278] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] self.f(*self.args, **self.kw) [ 2130.452630] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2130.452630] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] raise exceptions.translate_fault(task_info.error) [ 2130.452630] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2130.452630] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Faults: ['InvalidArgument'] [ 2130.452630] env[68492]: ERROR nova.compute.manager [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] [ 2130.452761] env[68492]: DEBUG nova.compute.utils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] VimFaultException {{(pid=68492) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 2130.454031] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Build of instance a90e989d-6aef-482f-b767-8dbdd7f29628 was re-scheduled: A specified parameter was not correct: fileType [ 2130.454031] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2130.454409] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2130.454580] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2130.454746] env[68492]: DEBUG nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2130.454903] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2130.762046] env[68492]: DEBUG nova.network.neutron [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2130.772925] env[68492]: INFO nova.compute.manager [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Took 0.32 seconds to deallocate network for instance. 
[ 2130.867886] env[68492]: INFO nova.scheduler.client.report [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Deleted allocations for instance a90e989d-6aef-482f-b767-8dbdd7f29628 [ 2130.895629] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bd96a163-d448-437b-b722-31b24a884f81 tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 674.218s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2130.895903] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 478.068s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2130.896145] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Acquiring lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2130.896359] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2130.896525] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2130.898920] env[68492]: INFO nova.compute.manager [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Terminating instance [ 2130.900707] env[68492]: DEBUG nova.compute.manager [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2130.900945] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2130.901477] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d7411932-a26d-471a-8123-22e288382449 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.913057] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83dabe8e-a026-42f5-9f1f-361d02bfc710 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2130.943041] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a90e989d-6aef-482f-b767-8dbdd7f29628 could not be found. [ 2130.943305] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2130.943560] env[68492]: INFO nova.compute.manager [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2130.943855] env[68492]: DEBUG oslo.service.loopingcall [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2130.944158] env[68492]: DEBUG nova.compute.manager [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2130.944290] env[68492]: DEBUG nova.network.neutron [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2130.974012] env[68492]: DEBUG nova.network.neutron [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2130.981592] env[68492]: INFO nova.compute.manager [-] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] Took 0.04 seconds to deallocate network for instance. 
[ 2131.065816] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0c7b60f3-eabe-4a23-9e7d-d474227f4d1f tempest-ServersTestFqdnHostnames-688777174 tempest-ServersTestFqdnHostnames-688777174-project-member] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.170s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2131.066637] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 362.819s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2131.066818] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a90e989d-6aef-482f-b767-8dbdd7f29628] During sync_power_state the instance has a pending task (deleting). Skip. [ 2131.066990] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "a90e989d-6aef-482f-b767-8dbdd7f29628" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2151.231559] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2153.232015] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2154.231618] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2154.231836] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2154.231917] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2154.253063] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253432] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253432] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253562] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253612] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253693] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253823] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.253963] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.254099] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2154.254220] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2155.231078] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.230612] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.230954] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.242397] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2156.242598] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2156.242761] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2156.242962] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2156.244442] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea7259d5-1e7e-48d6-9153-793e2764c18c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.252814] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b61cbce-2b61-4cdc-a9fc-cc51b8d36b8d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.267810] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f6394e-13a7-4363-b2f9-a7687e66b11d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.273980] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b2cc10f-3ce3-431e-b4b4-693148b2910c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.302608] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180935MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2156.302608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2156.302792] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2156.371276] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance aab8759d-db1e-4817-98bf-e1fb45e75640 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.371447] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.371594] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.371714] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.371835] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.371954] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.372083] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.372201] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.372316] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2156.372498] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2156.372633] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2156.476294] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86403613-6adc-4b4d-833e-62957b5acd7c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.484017] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c58d88cd-1f6a-4ea0-8be3-e93d56bf102d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.514316] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfabca40-c7a3-480b-b640-7e2f5db5f937 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.520897] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf38ec2-1ede-4baf-aaa7-0f250f537469 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.533697] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2156.541871] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider 
dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2156.555085] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2156.555223] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.555886] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.556294] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2158.231770] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2161.226940] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2175.458581] env[68492]: DEBUG oslo_concurrency.lockutils [None req-bad614ee-c1ef-4427-80c6-e933ef57b185 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2178.236445] env[68492]: WARNING oslo_vmware.rw_handles [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2178.236445] 
env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2178.236445] env[68492]: ERROR oslo_vmware.rw_handles [ 2178.237091] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2178.239556] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2178.239823] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Copying Virtual Disk [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/6e2eeeb2-ffc0-4cb0-903d-32258ba160c5/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2178.240128] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f314e740-3652-47f1-ab8e-71e4c71a6ece {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.248278] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 2178.248278] env[68492]: value = "task-3395578" [ 2178.248278] env[68492]: _type = "Task" [ 2178.248278] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2178.255880] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395578, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2178.759498] env[68492]: DEBUG oslo_vmware.exceptions [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2178.759777] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2178.760364] env[68492]: ERROR nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2178.760364] env[68492]: Faults: ['InvalidArgument'] [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Traceback (most recent call last): [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] yield resources [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self.driver.spawn(context, instance, image_meta, [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self._fetch_image_if_missing(context, vi) [ 2178.760364] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] image_cache(vi, tmp_image_ds_loc) [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] vm_util.copy_virtual_disk( [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] session._wait_for_task(vmdk_copy_task) [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return self.wait_for_task(task_ref) [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return evt.wait() [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] result = hub.switch() [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2178.760789] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return self.greenlet.switch() [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self.f(*self.args, **self.kw) [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] raise exceptions.translate_fault(task_info.error) [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Faults: ['InvalidArgument'] [ 2178.761191] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] [ 2178.761191] env[68492]: INFO nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Terminating instance [ 2178.762190] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2178.762406] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2178.762644] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f422cbcc-6e18-4383-a549-15dc4bac9579 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.764765] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2178.764944] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2178.765661] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ddaf548-15ba-4db8-aea6-29cddafb27d2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.772150] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2178.772364] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b0b2fc2e-8307-461c-ad3f-efb04ffb6a95 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.774528] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2178.774712] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2178.775654] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aedcbe19-8cc8-4041-88bd-37df47b1ee71 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.780386] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 2178.780386] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527d8722-dc66-cacf-6967-4191497cab48" [ 2178.780386] env[68492]: _type = "Task" [ 2178.780386] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2178.787395] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]527d8722-dc66-cacf-6967-4191497cab48, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2178.836233] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2178.836457] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2178.836640] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleting the datastore file [datastore2] aab8759d-db1e-4817-98bf-e1fb45e75640 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2178.836909] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e4348f14-37bb-4ad6-8d27-285a800cc5da {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2178.843455] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for the task: (returnval){ [ 2178.843455] env[68492]: value = "task-3395580" [ 2178.843455] env[68492]: _type = "Task" [ 2178.843455] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2178.852301] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395580, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2179.291312] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2179.291684] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating directory with path [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2179.291835] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0bb1a3d1-f22a-4a50-920c-4b87c04d7dba {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.302652] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Created directory with path [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2179.302834] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Fetch image to [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2179.302992] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2179.303742] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdedf4c9-b1ab-427b-b36e-5ed45098780e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.310139] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-686d15d7-0fc9-4ea9-93a5-02ddcd164cfa {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.318863] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea2ac620-1c97-4728-b222-665bd8231a1d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.351586] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11a04f50-6c94-427d-b7f2-f488a5241236 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.360052] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-54f76186-5dcd-4bbe-ae54-8f46abca264b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.361675] env[68492]: DEBUG oslo_vmware.api [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Task: {'id': task-3395580, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069601} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2179.361903] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2179.362094] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2179.362266] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2179.362436] env[68492]: INFO nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2179.364581] env[68492]: DEBUG nova.compute.claims [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Aborting claim: [Claim: 128 MB memory, 1 GB disk] {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2179.364757] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2179.364976] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2179.382010] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2179.430346] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2179.489494] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2179.489701] env[68492]: DEBUG oslo_vmware.rw_handles [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2179.576515] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e340cb-e989-486c-a06e-869563dc308c {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.583902] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f8e0d38-97bf-479a-996e-91bc733d3001 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.613694] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bb9b429-0472-4c73-ab67-415aaa42a1f5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.620543] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4305dce3-fc4a-4fc6-b281-72498226422e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2179.633153] env[68492]: DEBUG nova.compute.provider_tree [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2179.641604] env[68492]: DEBUG nova.scheduler.client.report [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2179.656478] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2179.656998] env[68492]: ERROR nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2179.656998] env[68492]: Faults: ['InvalidArgument'] [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Traceback (most recent call last): [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2179.656998] env[68492]: ERROR 
nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self.driver.spawn(context, instance, image_meta, [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self._fetch_image_if_missing(context, vi) [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] image_cache(vi, tmp_image_ds_loc) [ 2179.656998] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] vm_util.copy_virtual_disk( [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] session._wait_for_task(vmdk_copy_task) [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return self.wait_for_task(task_ref) [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return evt.wait() [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] result = hub.switch() [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] return self.greenlet.switch() [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2179.657309] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] self.f(*self.args, **self.kw) [ 2179.658276] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2179.658276] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] raise exceptions.translate_fault(task_info.error) [ 2179.658276] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2179.658276] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Faults: ['InvalidArgument'] [ 2179.658276] env[68492]: ERROR nova.compute.manager [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] [ 2179.658276] env[68492]: DEBUG nova.compute.utils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2179.659033] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Build of instance aab8759d-db1e-4817-98bf-e1fb45e75640 was re-scheduled: A specified parameter was not correct: fileType [ 2179.659033] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2179.659405] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2179.659577] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2179.659744] env[68492]: DEBUG nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2179.659915] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2179.988712] env[68492]: DEBUG nova.network.neutron [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2180.000951] env[68492]: INFO nova.compute.manager [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Took 0.34 seconds to deallocate network for instance. [ 2180.107224] env[68492]: INFO nova.scheduler.client.report [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Deleted allocations for instance aab8759d-db1e-4817-98bf-e1fb45e75640 [ 2180.134653] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4ea19299-efe6-41eb-b2f3-2dc2a986e7a3 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 593.005s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2180.134907] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 411.887s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2180.135103] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] During sync_power_state the instance has a pending task (spawning). Skip. 
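The Acquiring / acquired / "released" triples with waited/held timings (e.g. "held 593.005s" and "waited 411.887s" just above) are produced by oslo.concurrency's lockutils wrapper around a named process-local lock. A rough, self-contained sketch of that pattern, illustrative only and not the oslo_concurrency implementation:

    import threading
    import time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)  # one process-local lock per name

    @contextmanager
    def named_lock(name, caller):
        print(f'Acquiring lock "{name}" by "{caller}"')
        t0 = time.monotonic()
        _locks[name].acquire()
        print(f'Lock "{name}" acquired by "{caller}" :: waited {time.monotonic() - t0:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            _locks[name].release()
            print(f'Lock "{name}" "released" by "{caller}" :: held {time.monotonic() - t1:.3f}s')

    # Usage mirroring the records above:
    with named_lock("compute_resources", "ResourceTracker.abort_instance_claim"):
        pass  # critical section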
[ 2180.135274] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2180.135783] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 397.199s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2180.136036] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Acquiring lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2180.136258] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2180.136429] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2180.140287] env[68492]: INFO nova.compute.manager [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Terminating instance [ 2180.142215] env[68492]: DEBUG nova.compute.manager [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2180.142426] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2180.142925] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c492994f-6b35-41a8-9f97-116f5759dd80 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.152270] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b798013c-f6fd-4bfe-9b63-dc8e5dc567b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2180.179501] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aab8759d-db1e-4817-98bf-e1fb45e75640 could not be found. [ 2180.179700] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2180.179874] env[68492]: INFO nova.compute.manager [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2180.180157] env[68492]: DEBUG oslo.service.loopingcall [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2180.180372] env[68492]: DEBUG nova.compute.manager [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2180.180469] env[68492]: DEBUG nova.network.neutron [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2180.201702] env[68492]: DEBUG nova.network.neutron [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2180.209236] env[68492]: INFO nova.compute.manager [-] [instance: aab8759d-db1e-4817-98bf-e1fb45e75640] Took 0.03 seconds to deallocate network for instance. 
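The WARNING followed immediately by "Instance destroyed" above shows the destroy path tolerating a VM that no longer exists on the backend, which keeps a repeated terminate idempotent: the rescheduled build already cleaned the VM up, so this second destroy finds nothing to remove. A condensed sketch of that control flow, with hypothetical helpers standing in for the vmops internals:

    class InstanceNotFound(Exception):
        pass

    def destroy_instance(find_vm_ref, unregister, delete_files, instance_id):
        # A missing backend VM is logged and treated as already destroyed.
        try:
            vm_ref = find_vm_ref(instance_id)
        except InstanceNotFound:
            print(f"Instance does not exist on backend: {instance_id}")
            return  # nothing to clean up; terminate proceeds to network deallocation
        unregister(vm_ref)
        delete_files(vm_ref)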
[ 2180.298011] env[68492]: DEBUG oslo_concurrency.lockutils [None req-4b7301db-a856-45b3-9b68-cddf225f4484 tempest-DeleteServersTestJSON-1420200429 tempest-DeleteServersTestJSON-1420200429-project-member] Lock "aab8759d-db1e-4817-98bf-e1fb45e75640" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.162s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2181.092255] env[68492]: DEBUG oslo_concurrency.lockutils [None req-9dad1ea8-9657-481a-bcaa-d0bc95720515 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "696b560c-f4ed-4105-87e9-e5380a468fe1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2212.232198] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2214.232054] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2214.232419] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2214.232467] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2214.254107] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.254357] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.254581] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.254800] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.255024] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.255245] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.255478] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.255694] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2214.255900] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2215.231165] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.231455] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.241637] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2216.241866] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.242045] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2216.242204] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2216.243662] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e310e8-a354-4566-a8d6-abf5cceb9971 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.252329] env[68492]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d7d53c-22e6-42d8-bb83-04d27019815a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.266208] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7774dcc0-6eb5-4719-9734-4ca7097ac152 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.272335] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaf34d4b-d377-4795-91ef-347aff793584 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.301848] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180919MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2216.302051] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2216.302206] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.388807] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance e6c9ab71-8507-4238-9936-fd9a61101313 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.388976] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389132] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389263] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389383] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389502] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389620] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389737] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2216.389922] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2216.390089] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2216.482630] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-859b62e4-0c23-4b8f-8d1c-80695ae00d77 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.489952] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07769969-e736-418b-8e2f-a3c1ecd2b461 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.519180] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5622820d-7b19-4e13-b236-c1d6f1167a34 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.525788] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0296656b-5af3-4eb7-ab37-7a83ff682ac6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.539298] env[68492]: DEBUG nova.compute.provider_tree 
[None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2216.547763] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2216.562264] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2216.562439] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.562830] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2217.563146] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.226428] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.247638] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.247833] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2220.232484] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.226658] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2227.551234] env[68492]: WARNING oslo_vmware.rw_handles [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2227.551234] env[68492]: ERROR oslo_vmware.rw_handles [ 2227.551846] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2227.553967] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2227.554278] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Copying Virtual Disk [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/f834166b-cdac-4891-8ef6-7a0d66a2bf3a/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2227.554590] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-52161629-6982-4831-9c1a-4ae0736ccae2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.563296] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 2227.563296] env[68492]: value = "task-3395581" [ 2227.563296] env[68492]: _type = "Task" [ 2227.563296] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2227.571148] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395581, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2228.073971] env[68492]: DEBUG oslo_vmware.exceptions [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Fault InvalidArgument not matched. {{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2228.076719] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2228.076719] env[68492]: ERROR nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2228.076719] env[68492]: Faults: ['InvalidArgument'] [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Traceback (most recent call last): [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] yield resources [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self.driver.spawn(context, instance, image_meta, [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2228.076719] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self._fetch_image_if_missing(context, vi) [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] image_cache(vi, tmp_image_ds_loc) [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] vm_util.copy_virtual_disk( [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] session._wait_for_task(vmdk_copy_task) [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return self.wait_for_task(task_ref) [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return evt.wait() [ 2228.077067] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] result = hub.switch() [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return self.greenlet.switch() [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self.f(*self.args, **self.kw) [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] raise exceptions.translate_fault(task_info.error) [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2228.077382] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Faults: ['InvalidArgument'] [ 2228.077382] 
env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] [ 2228.077382] env[68492]: INFO nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Terminating instance [ 2228.077642] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2228.077642] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2228.077642] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5d495fca-9925-4644-a486-37f590751684 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.080863] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2228.081072] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2228.081778] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f55b061a-f358-4bbd-ae8f-57ada5d7f78b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.088382] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2228.088575] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dcb1f04b-cb70-4a07-aaef-61a403687b9f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.090598] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2228.090771] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e 
tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2228.091761] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f95a2da-8d4d-42ba-a179-f10759f5462b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.096497] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for the task: (returnval){ [ 2228.096497] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52032969-ebf9-6879-d8cf-7d742ffb931a" [ 2228.096497] env[68492]: _type = "Task" [ 2228.096497] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2228.103553] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]52032969-ebf9-6879-d8cf-7d742ffb931a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2228.170112] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2228.170352] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2228.170528] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleting the datastore file [datastore2] e6c9ab71-8507-4238-9936-fd9a61101313 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2228.170790] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6c04b9e9-9363-4f27-b79e-020e52a592f6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.176958] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 2228.176958] env[68492]: value = "task-3395583" [ 2228.176958] env[68492]: _type = "Task" [ 2228.176958] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2228.184313] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395583, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2228.607590] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2228.607949] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Creating directory with path [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2228.608108] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-02a27484-761a-42de-84e4-1960d4e90339 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.619681] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Created directory with path [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2228.619865] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Fetch image to [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2228.620039] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2228.620765] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e63a072-0c92-43c8-ac6d-06ae503ddb8e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.627658] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-327b9eb3-b9fc-46fc-9d68-b106f591b0b2 {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.636484] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d1e194-1964-4ed1-b5b8-902d8277f6ff {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.667274] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1bb5522-0d7f-41a3-bd36-4f2867d9ea64 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.672682] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5aae42e5-2023-4584-8d87-3eeccd7f7e40 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.684464] env[68492]: DEBUG oslo_vmware.api [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Task: {'id': task-3395583, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079204} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2228.684683] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2228.684865] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2228.685044] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2228.685222] env[68492]: INFO nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Took 0.60 seconds to destroy the instance on the hypervisor. 
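Records like "Preparing fetch location", the tmp-sparse.vmdk download, and the repeatedly failing CopyVirtualDisk_Task all belong to the VMware driver's image-cache fill path: on a cache miss the image is streamed into a temporary sparse VMDK on the datastore, then copied into the cache, and in these runs it is the copy step that raises VimFaultException ("A specified parameter was not correct: fileType"). A schematic sketch of that flow; the function names and the cache layout here are illustrative assumptions, not Nova's actual helpers:

    def fetch_image_if_missing(exists, fetch_http, copy_virtual_disk, image_id):
        # Hypothetical cache layout modeled on the datastore paths in this log.
        cached = f"devstack-image-cache_base/{image_id}/{image_id}.vmdk"
        if exists(cached):
            return cached               # cache hit: reuse the cached VMDK
        tmp = f"vmware_temp/{image_id}/tmp-sparse.vmdk"
        fetch_http(image_id, tmp)       # HTTP write handle into the datastore
        copy_virtual_disk(tmp, cached)  # CopyVirtualDisk_Task; the call failing above
        return cached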
[ 2228.687260] env[68492]: DEBUG nova.compute.claims [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2228.687432] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2228.687640] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2228.694026] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2228.767988] env[68492]: DEBUG oslo_vmware.rw_handles [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2228.828765] env[68492]: DEBUG oslo_vmware.rw_handles [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2228.828765] env[68492]: DEBUG oslo_vmware.rw_handles [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2228.893613] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c95acca-7d28-4aec-b31e-57c16b362761 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.901803] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2220c3c8-9e89-4d9e-8689-006b1694aae8 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.931418] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948c364d-a4d2-47f0-ac4a-fde5d4eab718 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.938834] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c28407ac-4c22-4c08-a54e-6a9d9ffcbbdd {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2228.953867] env[68492]: DEBUG nova.compute.provider_tree [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2228.963124] env[68492]: DEBUG nova.scheduler.client.report [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2228.979241] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2228.979768] env[68492]: ERROR nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2228.979768] env[68492]: Faults: ['InvalidArgument'] [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Traceback (most recent call last): [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: 
e6c9ab71-8507-4238-9936-fd9a61101313] self.driver.spawn(context, instance, image_meta, [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self._fetch_image_if_missing(context, vi) [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] image_cache(vi, tmp_image_ds_loc) [ 2228.979768] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] vm_util.copy_virtual_disk( [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] session._wait_for_task(vmdk_copy_task) [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return self.wait_for_task(task_ref) [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return evt.wait() [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] result = hub.switch() [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] return self.greenlet.switch() [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2228.980093] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] self.f(*self.args, **self.kw) [ 2228.980391] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2228.980391] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] raise exceptions.translate_fault(task_info.error) [ 2228.980391] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2228.980391] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Faults: ['InvalidArgument'] [ 2228.980391] env[68492]: ERROR nova.compute.manager [instance: e6c9ab71-8507-4238-9936-fd9a61101313] [ 2228.980520] env[68492]: DEBUG nova.compute.utils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2228.981987] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Build of instance e6c9ab71-8507-4238-9936-fd9a61101313 was re-scheduled: A specified parameter was not correct: fileType [ 2228.981987] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2228.982373] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2228.982544] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2228.982726] env[68492]: DEBUG nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2228.982907] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2229.461357] env[68492]: DEBUG nova.network.neutron [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2229.472054] env[68492]: INFO nova.compute.manager [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Took 0.49 seconds to deallocate network for instance. [ 2229.561118] env[68492]: INFO nova.scheduler.client.report [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Deleted allocations for instance e6c9ab71-8507-4238-9936-fd9a61101313 [ 2229.586706] env[68492]: DEBUG oslo_concurrency.lockutils [None req-d058074c-2b20-459c-bdb0-e4bd09c0bc43 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.051s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2229.586992] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 461.339s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.587257] env[68492]: INFO nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] During sync_power_state the instance has a pending task (spawning). Skip. 
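
The Acquiring/acquired/released lines around the instance UUID above come from oslo.concurrency's lockutils, which serializes critical sections on process-local named locks and records how long each caller waited for and then held the lock (here do_terminate_instance waited 440.889s behind the stuck build). A rough sketch of that pattern, assuming one threading.Lock per name; this is not lockutils' real implementation:

    import threading
    import time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)   # one lock object per lock name

    @contextmanager
    def named_lock(name, caller):
        """Acquire a process-local named lock, logging wait and hold times."""
        print(f'Acquiring lock "{name}" by "{caller}"')
        start = time.monotonic()
        with _locks[name]:
            print(f'Lock "{name}" acquired by "{caller}" :: '
                  f'waited {time.monotonic() - start:.3f}s')
            held_from = time.monotonic()
            try:
                yield
            finally:
                print(f'Lock "{name}" "released" by "{caller}" :: '
                      f'held {time.monotonic() - held_from:.3f}s')

    # Usage: serialize all operations against one instance by its UUID.
    with named_lock("e6c9ab71-8507-4238-9936-fd9a61101313",
                    "do_terminate_instance"):
        pass  # terminate the instance here
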
[ 2229.587864] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2229.588065] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.889s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.588358] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2229.588564] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.588769] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2229.590760] env[68492]: INFO nova.compute.manager [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Terminating instance [ 2229.592997] env[68492]: DEBUG nova.compute.manager [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2229.593200] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2229.593476] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ee295bee-c52f-48b0-9958-935149ce5a6a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.602950] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e855847e-98c7-4cf1-92ca-3af9a0b51523 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.631440] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e6c9ab71-8507-4238-9936-fd9a61101313 could not be found. [ 2229.631722] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2229.631877] env[68492]: INFO nova.compute.manager [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2229.632104] env[68492]: DEBUG oslo.service.loopingcall [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2229.632324] env[68492]: DEBUG nova.compute.manager [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2229.632449] env[68492]: DEBUG nova.network.neutron [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2229.657817] env[68492]: DEBUG nova.network.neutron [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2229.666176] env[68492]: INFO nova.compute.manager [-] [instance: e6c9ab71-8507-4238-9936-fd9a61101313] Took 0.03 seconds to deallocate network for instance. 
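
This terminate succeeds even though the rescheduled instance was never actually recreated on the backend: the driver logs "Instance does not exist on backend", swallows InstanceNotFound, and proceeds to network deallocation, which makes destroy idempotent. A toy sketch of that tolerant-destroy shape, with an invented FakeBackend whose helpers do not match the real driver's:

    class InstanceNotFound(Exception):
        """Raised when the hypervisor has no VM for the given UUID."""

    class FakeBackend:
        """Toy hypervisor backend; the VM is already gone."""
        def find_vm(self, uuid):
            raise InstanceNotFound(f"Instance {uuid} could not be found.")
        def unregister(self, vm_ref): ...
        def delete_files(self, vm_ref): ...

    def destroy(backend, uuid):
        # Idempotent destroy: a missing VM is logged as a warning and
        # treated as already destroyed, so the delete still succeeds.
        try:
            vm_ref = backend.find_vm(uuid)
            backend.unregister(vm_ref)
            backend.delete_files(vm_ref)
        except InstanceNotFound as exc:
            print(f"WARNING: Instance does not exist on backend: {exc}")
        print(f"[instance: {uuid}] Instance destroyed")

    destroy(FakeBackend(), "e6c9ab71-8507-4238-9936-fd9a61101313")
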
[ 2229.759767] env[68492]: DEBUG oslo_concurrency.lockutils [None req-0b9bd100-b721-4ee4-af70-282cc22d82c0 tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "e6c9ab71-8507-4238-9936-fd9a61101313" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2230.579165] env[68492]: DEBUG oslo_concurrency.lockutils [None req-3521c4d0-118e-4a4b-b03e-dd67782c0ddf tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2272.233809] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2276.233161] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2276.233161] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2276.233456] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2276.254996] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255200] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255339] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255478] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255601] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255723] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255844] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2276.255965] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2276.256506] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2276.580687] env[68492]: WARNING oslo_vmware.rw_handles [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2276.580687] env[68492]: ERROR oslo_vmware.rw_handles [ 2276.581254] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2276.583370] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 
tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2276.583640] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Copying Virtual Disk [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/3f32ee18-75b3-4a91-b0aa-5c8dc6670d5c/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2276.583969] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bcb79526-529e-4e14-9fd2-5a4df48ae013 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2276.591833] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for the task: (returnval){ [ 2276.591833] env[68492]: value = "task-3395584" [ 2276.591833] env[68492]: _type = "Task" [ 2276.591833] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2276.600039] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Task: {'id': task-3395584, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2277.102463] env[68492]: DEBUG oslo_vmware.exceptions [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2277.102716] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2277.103291] env[68492]: ERROR nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2277.103291] env[68492]: Faults: ['InvalidArgument'] [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Traceback (most recent call last): [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] yield resources [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self.driver.spawn(context, instance, image_meta, [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self._fetch_image_if_missing(context, vi) [ 2277.103291] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] image_cache(vi, tmp_image_ds_loc) [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] vm_util.copy_virtual_disk( [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] session._wait_for_task(vmdk_copy_task) [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 
610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return self.wait_for_task(task_ref) [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return evt.wait() [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] result = hub.switch() [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2277.103897] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return self.greenlet.switch() [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self.f(*self.args, **self.kw) [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] raise exceptions.translate_fault(task_info.error) [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Faults: ['InvalidArgument'] [ 2277.104464] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] [ 2277.104464] env[68492]: INFO nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Terminating instance [ 2277.105133] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2277.105331] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2277.105571] env[68492]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.MakeDirectory with opID=oslo.vmware-e8dd06e8-01a0-42ed-9d68-6e90c7f12661 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.107682] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2277.107876] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2277.108591] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c884ccf-1705-4c9d-af56-7761f2b1feaf {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.115351] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2277.115553] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-09c0115d-5323-4143-98ec-325d613e3f1b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.117611] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2277.117791] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2277.118704] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fc41fb2-0cc3-41e4-a9cb-19f0c9d53c2f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.123154] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 2277.123154] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5226f3e1-8973-0ada-0bbb-427f806e51ec" [ 2277.123154] env[68492]: _type = "Task" [ 2277.123154] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2277.130220] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5226f3e1-8973-0ada-0bbb-427f806e51ec, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2277.181521] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2277.181735] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2277.181916] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Deleting the datastore file [datastore2] 610e0ba9-49f1-45b7-9dea-08945d1d56b9 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2277.182191] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d47dac2a-e111-46b3-a149-778bf9eceb94 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.188168] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for the task: (returnval){ [ 2277.188168] env[68492]: value = "task-3395586" [ 2277.188168] env[68492]: _type = "Task" [ 2277.188168] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2277.195376] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Task: {'id': task-3395586, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2277.230897] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2277.633148] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2277.633484] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating directory with path [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2277.633610] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-211854fc-737e-4795-855b-cf1954640f51 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.644254] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Created directory with path [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2277.644445] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Fetch image to [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2277.644613] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2277.645374] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64d4606e-6eea-41f2-8139-dfbe8ee06b38 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.651944] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9da7de7f-8a02-4e31-a889-c07529ec5149 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.661646] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-528c0286-146f-45da-9ab9-632f57a8224d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.694536] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd871ef-c905-4743-a124-a7d8a9d1743a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.701372] env[68492]: DEBUG oslo_vmware.api [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Task: {'id': task-3395586, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074723} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2277.702782] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2277.702977] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2277.703207] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2277.703392] env[68492]: INFO nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Took 0.60 seconds to destroy the instance on the hypervisor. 
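
The second build repeats the same image-cache cycle: prepare a fetch location under vmware_temp/, download the image as tmp-sparse.vmdk, then copy it into devstack-image-cache_base via CopyVirtualDisk_Task, the step that keeps failing here with InvalidArgument: fileType. A simplified sketch of that fetch-if-missing flow using the local filesystem in place of the datastore; all paths and the download callable are illustrative, and the per-image lock the real code holds around the cached vmdk is omitted:

    import shutil
    from pathlib import Path

    def fetch_image_if_missing(image_id, download, cache_dir, tmp_dir):
        """Fill the per-image cache: fetch to a temp location, then copy
        the result into the shared cache directory."""
        cached = Path(cache_dir, image_id, f"{image_id}.vmdk")
        if cached.exists():
            return cached                        # cache hit: nothing to do
        tmp = Path(tmp_dir, image_id, "tmp-sparse.vmdk")
        tmp.parent.mkdir(parents=True, exist_ok=True)  # "Creating directory with path ..."
        download(tmp)                            # "Downloading image file data ..."
        cached.parent.mkdir(parents=True, exist_ok=True)
        shutil.copyfile(tmp, cached)             # stands in for CopyVirtualDisk_Task
        tmp.unlink()                             # discard the temp sparse image
        return cached

    # Example: the "download" just writes placeholder bytes.
    p = fetch_image_if_missing(
        "595bda25-3485-4d7e-9f66-50f61186cadc",
        lambda dest: dest.write_bytes(b"\x00" * 1024),
        cache_dir="/tmp/devstack-image-cache_base",
        tmp_dir="/tmp/vmware_temp",
    )
    print(p)
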
[ 2277.705196] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cab11a24-3067-4e09-a7e3-6d5fee5fe640 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.707170] env[68492]: DEBUG nova.compute.claims [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2277.707370] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2277.707609] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2277.730917] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2277.780519] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2277.841134] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2277.841337] env[68492]: DEBUG oslo_vmware.rw_handles [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2277.904311] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c658d8d-1811-4532-ae3a-32c935ea68e2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.911492] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1b880fb-e1be-42d8-9a4c-5a7b8c1bd01e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.942046] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7f4fec0-300d-422e-a205-e60e0aab525b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.949236] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfa76231-35d2-49bb-ab65-7f0018f2db00 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2277.961990] env[68492]: DEBUG nova.compute.provider_tree [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2277.970250] env[68492]: DEBUG nova.scheduler.client.report [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2277.985736] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2277.985862] env[68492]: ERROR nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2277.985862] env[68492]: Faults: ['InvalidArgument'] [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Traceback (most recent call last): [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File 
"/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self.driver.spawn(context, instance, image_meta, [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self._fetch_image_if_missing(context, vi) [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] image_cache(vi, tmp_image_ds_loc) [ 2277.985862] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] vm_util.copy_virtual_disk( [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] session._wait_for_task(vmdk_copy_task) [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return self.wait_for_task(task_ref) [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return evt.wait() [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] result = hub.switch() [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] return self.greenlet.switch() [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2277.986179] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] self.f(*self.args, **self.kw) [ 2277.986479] 
env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2277.986479] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] raise exceptions.translate_fault(task_info.error) [ 2277.986479] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2277.986479] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Faults: ['InvalidArgument'] [ 2277.986479] env[68492]: ERROR nova.compute.manager [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] [ 2277.986640] env[68492]: DEBUG nova.compute.utils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2277.988032] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Build of instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 was re-scheduled: A specified parameter was not correct: fileType [ 2277.988032] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2277.988403] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2277.988577] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2277.988744] env[68492]: DEBUG nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2277.988904] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2278.230625] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.230815] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2278.230926] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.241647] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2278.241864] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2278.242048] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2278.242208] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2278.243326] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1077038-cc64-44c3-91fe-da216545c10a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.251860] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ca5dcac-f084-4713-80a5-5579bae3da8f {{(pid=68492) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.265664] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729e9cfa-3be5-4fe6-95f4-0b6ea9b7b686 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.271869] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-496fa76f-1f65-4e67-83a8-a14e3151e903 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.299787] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180899MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2278.299922] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2278.300129] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2278.365691] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1761}} [ 2278.365857] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance a9111481-6ba1-4d76-bce9-8db609eb704d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.365986] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.366129] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.366270] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.366400] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.366518] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2278.366699] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2278.366839] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2278.453547] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73234fcb-af17-46d6-a9a3-d7009a6ff364 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.461983] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01a90f06-dc1e-43a4-96d1-8e4ba3b06d60 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.491694] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a81b8cce-7964-491a-abf2-6c4672c3dd98 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.498261] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3123e47-3ff4-4ebb-9012-f35bb38aeedb {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.511159] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2278.521664] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider 
dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2278.545814] env[68492]: DEBUG nova.network.neutron [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2278.547243] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2278.547243] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2278.557277] env[68492]: INFO nova.compute.manager [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Took 0.57 seconds to deallocate network for instance. 
[ 2278.668689] env[68492]: INFO nova.scheduler.client.report [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Deleted allocations for instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 [ 2278.696894] env[68492]: DEBUG oslo_concurrency.lockutils [None req-97e9f015-2da0-4edd-8516-65c266e3d69e tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 647.174s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2278.697260] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 452.049s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2278.697608] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Acquiring lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2278.697803] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2278.697982] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2278.701551] env[68492]: INFO nova.compute.manager [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Terminating instance [ 2278.703508] env[68492]: DEBUG nova.compute.manager [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Start destroying the instance on the hypervisor. 
{{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2278.703739] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2278.704373] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4dfe8df3-7d83-4389-a3ed-c4238e63dd91 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2278.714491] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7071c4ee-b41c-4a3f-84e4-8d41a9b8f9c4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2279.478813] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 610e0ba9-49f1-45b7-9dea-08945d1d56b9 could not be found. [ 2279.479089] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2279.479218] env[68492]: INFO nova.compute.manager [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Took 0.78 seconds to destroy the instance on the hypervisor. [ 2279.479468] env[68492]: DEBUG oslo.service.loopingcall [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2279.479685] env[68492]: DEBUG nova.compute.manager [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2279.479781] env[68492]: DEBUG nova.network.neutron [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2279.503767] env[68492]: DEBUG nova.network.neutron [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2279.511481] env[68492]: INFO nova.compute.manager [-] [instance: 610e0ba9-49f1-45b7-9dea-08945d1d56b9] Took 0.03 seconds to deallocate network for instance. 
[ 2279.547346] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.595265] env[68492]: DEBUG oslo_concurrency.lockutils [None req-db8a1fa7-12fb-4c27-8aec-d498bd67e161 tempest-ServersNegativeTestMultiTenantJSON-2113462330 tempest-ServersNegativeTestMultiTenantJSON-2113462330-project-member] Lock "610e0ba9-49f1-45b7-9dea-08945d1d56b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.898s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2282.231737] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2283.227500] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2325.943054] env[68492]: WARNING oslo_vmware.rw_handles [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2325.943054] env[68492]: ERROR oslo_vmware.rw_handles [ 2325.943054] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2325.945108] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 
tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2325.945359] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Copying Virtual Disk [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/f930e5e2-0b05-41f2-94e0-71c2786c517a/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2325.945661] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-38bef4d7-e7bf-4cee-ba39-20631664e1e9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2325.953366] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 2325.953366] env[68492]: value = "task-3395587" [ 2325.953366] env[68492]: _type = "Task" [ 2325.953366] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2325.961536] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395587, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2326.464072] env[68492]: DEBUG oslo_vmware.exceptions [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2326.464325] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2326.464896] env[68492]: ERROR nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2326.464896] env[68492]: Faults: ['InvalidArgument'] [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Traceback (most recent call last): [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] yield resources [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self.driver.spawn(context, instance, image_meta, [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self._fetch_image_if_missing(context, vi) [ 2326.464896] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] image_cache(vi, tmp_image_ds_loc) [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] vm_util.copy_virtual_disk( [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] session._wait_for_task(vmdk_copy_task) [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return self.wait_for_task(task_ref) [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return evt.wait() [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] result = hub.switch() [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2326.465348] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return self.greenlet.switch() [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self.f(*self.args, **self.kw) [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] raise exceptions.translate_fault(task_info.error) [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Faults: ['InvalidArgument'] [ 2326.465673] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] [ 2326.465673] env[68492]: INFO nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Terminating instance [ 2326.466808] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2326.467053] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2326.467289] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a68b6c4e-f368-44bb-a827-4fd8f7101b3a 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.469546] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2326.469738] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2326.470475] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8df3524-1121-4298-9ccd-a59ad27f1507 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.477417] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2326.477627] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d8c31397-6e5c-4296-ad41-ffbff30dd577 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.479663] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2326.479838] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2326.480771] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-68b28c99-f581-4a6d-9a96-71fb301d1b31 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.485780] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 2326.485780] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5218cd6e-d2a9-0ce1-08c6-6d1744a8deb1" [ 2326.485780] env[68492]: _type = "Task" [ 2326.485780] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2326.493856] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5218cd6e-d2a9-0ce1-08c6-6d1744a8deb1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2326.547149] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2326.547375] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2326.547558] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleting the datastore file [datastore2] a9111481-6ba1-4d76-bce9-8db609eb704d {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2326.547824] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e16e7325-83cd-49fd-8e07-af375bfefd47 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2326.553793] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for the task: (returnval){ [ 2326.553793] env[68492]: value = "task-3395589" [ 2326.553793] env[68492]: _type = "Task" [ 2326.553793] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2326.561830] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395589, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2326.996151] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2326.996481] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating directory with path [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2326.996636] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-811a660e-1673-45ec-85c2-4271fe2605f9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.010950] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created directory with path [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2327.011151] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Fetch image to [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2327.011321] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2327.011999] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e81e8d8d-4443-417d-8d4d-4ba50ee6aede {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.018243] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd9e1a5-ab34-4406-bc44-817a722c36bc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.026929] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3942805a-ea64-4609-8459-e72525a5c66f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.058940] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d4c1d5f-1348-471f-a50d-07207a0f427c 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.065732] env[68492]: DEBUG oslo_vmware.api [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Task: {'id': task-3395589, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.09115} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2327.067106] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2327.067300] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2327.067469] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2327.067637] env[68492]: INFO nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2327.069369] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ddd47fa5-e351-4fbc-85ee-685660691062 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.071194] env[68492]: DEBUG nova.compute.claims [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2327.071368] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2327.071585] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2327.091715] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2327.144454] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2327.204497] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2327.204696] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2327.255314] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d7fb06d-3372-4dab-8e78-f965cd20276a {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.263138] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12cfbc1b-128c-4ac9-b3a8-371af2e579ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.292790] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-276c74d6-7308-4b88-b8af-6ea7150b477f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.300358] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb904dea-b814-4ecd-b398-4e2fbac8f74e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.313451] env[68492]: DEBUG nova.compute.provider_tree [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2327.323177] env[68492]: DEBUG nova.scheduler.client.report [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2327.335377] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.264s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2327.335912] env[68492]: ERROR nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2327.335912] env[68492]: Faults: ['InvalidArgument'] [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Traceback (most recent call last): [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2327.335912] 
env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self.driver.spawn(context, instance, image_meta, [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self._fetch_image_if_missing(context, vi) [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] image_cache(vi, tmp_image_ds_loc) [ 2327.335912] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] vm_util.copy_virtual_disk( [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] session._wait_for_task(vmdk_copy_task) [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return self.wait_for_task(task_ref) [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return evt.wait() [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] result = hub.switch() [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] return self.greenlet.switch() [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2327.336315] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] self.f(*self.args, **self.kw) [ 2327.336754] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2327.336754] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] raise exceptions.translate_fault(task_info.error) [ 2327.336754] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2327.336754] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Faults: ['InvalidArgument'] [ 2327.336754] env[68492]: ERROR nova.compute.manager [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] [ 2327.336754] env[68492]: DEBUG nova.compute.utils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2327.338167] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Build of instance a9111481-6ba1-4d76-bce9-8db609eb704d was re-scheduled: A specified parameter was not correct: fileType [ 2327.338167] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2327.338537] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2327.338705] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2327.338870] env[68492]: DEBUG nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2327.339042] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2327.605192] env[68492]: DEBUG nova.network.neutron [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2327.647780] env[68492]: INFO nova.compute.manager [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Took 0.31 seconds to deallocate network for instance. [ 2327.755462] env[68492]: INFO nova.scheduler.client.report [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Deleted allocations for instance a9111481-6ba1-4d76-bce9-8db609eb704d [ 2327.777877] env[68492]: DEBUG oslo_concurrency.lockutils [None req-a518733a-05ec-4767-a800-d41378305bd1 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 687.127s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2327.778161] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 491.796s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2327.778382] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Acquiring lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2327.778585] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2327.778776] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2327.780625] env[68492]: INFO nova.compute.manager [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Terminating instance [ 2327.782339] env[68492]: DEBUG nova.compute.manager [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2327.782573] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2327.783343] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9100ce49-c476-415a-b213-97dee3e3c5a9 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.793427] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f757c702-8d82-4e7f-9527-bc8bce6bad0f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2327.822496] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a9111481-6ba1-4d76-bce9-8db609eb704d could not be found. [ 2327.822737] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2327.822952] env[68492]: INFO nova.compute.manager [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2327.823242] env[68492]: DEBUG oslo.service.loopingcall [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2327.823471] env[68492]: DEBUG nova.compute.manager [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2327.823567] env[68492]: DEBUG nova.network.neutron [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2327.864207] env[68492]: DEBUG nova.network.neutron [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2327.871985] env[68492]: INFO nova.compute.manager [-] [instance: a9111481-6ba1-4d76-bce9-8db609eb704d] Took 0.05 seconds to deallocate network for instance. [ 2327.957867] env[68492]: DEBUG oslo_concurrency.lockutils [None req-1fce4ab1-fc98-4efc-89d6-6d75c55aa0b4 tempest-AttachVolumeNegativeTest-1796620246 tempest-AttachVolumeNegativeTest-1796620246-project-member] Lock "a9111481-6ba1-4d76-bce9-8db609eb704d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.180s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2333.231881] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.230984] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.231362] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Starting heal instance info cache {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9917}} [ 2336.231362] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Rebuilding the list of instances to heal {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9921}} [ 2336.247581] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2336.247725] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Skipping network cache update for instance because it is Building. 
{{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2336.247849] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 75bbcae2-54ab-47d2-9bf8-b55b0881fb90] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2336.247973] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 696b560c-f4ed-4105-87e9-e5380a468fe1] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2336.248112] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] [instance: 62a40c52-fae7-4025-b0af-1c2124e4d6f5] Skipping network cache update for instance because it is Building. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9930}} [ 2336.248294] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Didn't find any instances for network info cache update. {{(pid=68492) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:10003}} [ 2336.248737] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2338.231026] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2338.231354] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2338.231468] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68492) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10536}} [ 2340.227432] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.243241] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.243399] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2340.254659] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2340.254898] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2340.255081] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2340.255237] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68492) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2340.256336] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e522e5-0326-4c2d-bdd7-4d7566221ced {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.265251] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6219b832-d5a4-44e5-aaf6-b547bb47e3a2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.278841] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-912b001d-1603-4e9d-a5a1-14b716308bf6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.285074] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6fd2f86-9573-4552-850e-8acea30a630b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.314302] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=102GB free_vcpus=48 pci_devices=None {{(pid=68492) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2340.314447] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2340.314630] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2340.414435] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance c472a34d-b388-46c9-a7e0-7106b0666478 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2340.414595] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance ffddeec8-4442-413c-a0a0-2cf2b110cf14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2340.414722] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2340.414841] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 696b560c-f4ed-4105-87e9-e5380a468fe1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2340.414960] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Instance 62a40c52-fae7-4025-b0af-1c2124e4d6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68492) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1704}} [ 2340.415170] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2340.415311] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=68492) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2340.430874] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing inventories for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2340.443892] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating ProviderTree inventory for provider dba0d66f-84ca-40a4-90ee-609cf684af11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2340.444095] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Updating inventory in ProviderTree for provider dba0d66f-84ca-40a4-90ee-609cf684af11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2340.454745] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing aggregate associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, aggregates: None {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2340.473523] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Refreshing trait associations for resource provider dba0d66f-84ca-40a4-90ee-609cf684af11, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=68492) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2340.537328] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1685bc8b-e6d0-4795-b455-7d0213a58e08 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.544846] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bd8c7305-7bcc-44d5-b007-8c1ac58f1e8b {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.573750] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2632a93-0669-4521-b044-73bbbc4ab2f1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.580728] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-733f03dd-e99f-4de4-8288-62a6607f69d4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2340.593368] env[68492]: DEBUG nova.compute.provider_tree [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2340.601370] env[68492]: DEBUG nova.scheduler.client.report [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2340.621060] env[68492]: DEBUG nova.compute.resource_tracker [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68492) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2340.621060] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2343.232050] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.232050] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.232469] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.232469] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11204}} [ 2343.242871] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] 
There are 0 instances to clean {{(pid=68492) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11213}} [ 2345.231483] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2345.231828] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Cleaning up deleted instances with incomplete migration {{(pid=68492) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11242}} [ 2356.232053] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2360.245726] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2360.246046] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 2360.246046] env[68492]: value = "domain-c8" [ 2360.246046] env[68492]: _type = "ClusterComputeResource" [ 2360.246046] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2360.247171] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e64caf7-67d8-4d4b-9a43-dfba460749db {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.260980] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 5 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2375.960661] env[68492]: WARNING oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles response.begin() [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2375.960661] env[68492]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2375.960661] env[68492]: ERROR oslo_vmware.rw_handles [ 2375.961458] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Downloaded image file data 595bda25-3485-4d7e-9f66-50f61186cadc to vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2375.963294] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Caching image {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2375.963631] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Copying Virtual Disk [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk to [datastore2] vmware_temp/efe04c1e-5529-4dbf-abc4-6b91595c6e5a/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk {{(pid=68492) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2375.963991] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4643e487-6cb7-438c-9464-6bb8ceca6a9e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2375.972801] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 2375.972801] env[68492]: value = "task-3395590" [ 2375.972801] env[68492]: _type = "Task" [ 2375.972801] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2375.980606] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': task-3395590, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2376.483327] env[68492]: DEBUG oslo_vmware.exceptions [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Fault InvalidArgument not matched. 
{{(pid=68492) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2376.483613] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2376.484207] env[68492]: ERROR nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2376.484207] env[68492]: Faults: ['InvalidArgument'] [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Traceback (most recent call last): [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] yield resources [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self.driver.spawn(context, instance, image_meta, [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self._fetch_image_if_missing(context, vi) [ 2376.484207] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] image_cache(vi, tmp_image_ds_loc) [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] vm_util.copy_virtual_disk( [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] session._wait_for_task(vmdk_copy_task) [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return self.wait_for_task(task_ref) [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return evt.wait() [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] result = hub.switch() [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2376.484520] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return self.greenlet.switch() [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self.f(*self.args, **self.kw) [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] raise exceptions.translate_fault(task_info.error) [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Faults: ['InvalidArgument'] [ 2376.484917] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] [ 2376.484917] env[68492]: INFO nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Terminating instance [ 2376.486107] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2376.486849] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2376.486849] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d8f7fc1-8565-4dda-b93e-9e84c16e75e9 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.488947] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2376.489148] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2376.489862] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c006b175-ba59-460e-90de-7692a18ddfa2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.497865] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Unregistering the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2376.498923] env[68492]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-22fa8f89-c149-443a-880d-5e42f2ff648d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.500351] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2376.500526] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68492) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2376.501261] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5dd0411e-45ef-49d5-80d0-d316405d41b7 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.507517] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 2376.507517] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528152f6-1343-2701-519f-195890701a2e" [ 2376.507517] env[68492]: _type = "Task" [ 2376.507517] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2376.515087] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]528152f6-1343-2701-519f-195890701a2e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2376.577769] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Unregistered the VM {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2376.577985] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Deleting contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2376.578181] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Deleting the datastore file [datastore2] c472a34d-b388-46c9-a7e0-7106b0666478 {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2376.578450] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14668574-4d06-4ec9-ab69-67393ec191b3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.584245] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for the task: (returnval){ [ 2376.584245] env[68492]: value = "task-3395592" [ 2376.584245] env[68492]: _type = "Task" [ 2376.584245] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2376.592070] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': task-3395592, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2377.018430] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Preparing fetch location {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2377.018777] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating directory with path [datastore2] vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2377.019062] env[68492]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9c661ad-53d1-40e6-9e24-b51d41069f21 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.030631] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Created directory with path [datastore2] vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2377.030876] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Fetch image to [datastore2] vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2377.031024] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to [datastore2] vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk on the data store datastore2 {{(pid=68492) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2377.031700] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f5c336-07bd-49ea-b878-78e81de6cae6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.038336] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf4d2522-626f-4b5b-8bc5-b8173ab34098 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.048099] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af73e43b-d7cf-462d-b55e-d529a86dafa2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.079280] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db6d73d1-c43a-471b-a9c4-3b6dcf3ec880 
{{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.087509] env[68492]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c3c0e3fb-d2bd-40d4-bb3a-daa951619888 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.093491] env[68492]: DEBUG oslo_vmware.api [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Task: {'id': task-3395592, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069161} completed successfully. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2377.093911] env[68492]: DEBUG nova.virt.vmwareapi.ds_util [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Deleted the datastore file {{(pid=68492) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2377.094128] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Deleted contents of the VM from datastore datastore2 {{(pid=68492) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2377.094303] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2377.094539] env[68492]: INFO nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Took 0.61 seconds to destroy the instance on the hypervisor. 
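Each failed build walks the same cache-miss sequence traced above: prepare a fetch location under vmware_temp, download the sparse VMDK from the image service, then copy it into the image cache; it is that copy step (CopyVirtualDisk_Task) that keeps failing with the fileType/InvalidArgument fault. A rough, backend-agnostic sketch of the flow under those assumptions; the injected callables are stand-ins, not Nova APIs:

import uuid
from pathlib import Path

def fetch_image_if_missing(datastore: Path, image_id: str, download, copy_disk):
    # Cache hit: the cached VMDK already exists, nothing to fetch.
    cached = datastore / "devstack-image-cache_base" / image_id / f"{image_id}.vmdk"
    if cached.exists():
        return cached
    # Cache miss: prepare a fetch location under vmware_temp/<random>/...
    tmp = datastore / "vmware_temp" / str(uuid.uuid4()) / image_id / "tmp-sparse.vmdk"
    tmp.parent.mkdir(parents=True, exist_ok=True)
    download(tmp)              # "Downloading image file data ... tmp-sparse.vmdk"
    cached.parent.mkdir(parents=True, exist_ok=True)
    copy_disk(tmp, cached)     # the copy/cache step raising InvalidArgument here
    return cached

Because the cached copy is never created, every subsequent instance using image 595bda25-3485-4d7e-9f66-50f61186cadc misses the cache again and re-downloads the full 21318656-byte file, which is why the download/copy/fault cycle repeats through this section.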
[ 2377.098131] env[68492]: DEBUG nova.compute.claims [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Aborting claim: {{(pid=68492) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2377.098308] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2377.098520] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2377.118103] env[68492]: DEBUG nova.virt.vmwareapi.images [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: ffddeec8-4442-413c-a0a0-2cf2b110cf14] Downloading image file data 595bda25-3485-4d7e-9f66-50f61186cadc to the data store datastore2 {{(pid=68492) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2377.188177] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68492) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2377.250957] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Completed reading data from the image iterator. {{(pid=68492) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2377.251176] env[68492]: DEBUG oslo_vmware.rw_handles [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0940092f-9e00-437d-a6b3-bfca845f12ca/595bda25-3485-4d7e-9f66-50f61186cadc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68492) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2377.284264] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7824cb4c-7121-4896-9dac-934ed4e00f01 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.292161] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e69f765-b216-4f20-bda0-6280e9177169 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.324768] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efbedde7-34ba-4978-b67c-8f3b5c0d1994 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.331804] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-532f2029-619d-48f1-aed2-4d25bd77acdc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.346719] env[68492]: DEBUG nova.compute.provider_tree [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2377.357466] env[68492]: DEBUG nova.scheduler.client.report [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2377.372233] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "bac43f46-a210-4c37-8fea-7ca57b902144" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2377.372464] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "bac43f46-a210-4c37-8fea-7ca57b902144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2377.373681] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2377.374287] env[68492]: ERROR nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2377.374287] env[68492]: Faults: ['InvalidArgument'] [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Traceback (most recent call last): [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self.driver.spawn(context, instance, image_meta, [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self._fetch_image_if_missing(context, vi) [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] image_cache(vi, tmp_image_ds_loc) [ 2377.374287] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] vm_util.copy_virtual_disk( [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] session._wait_for_task(vmdk_copy_task) [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return self.wait_for_task(task_ref) [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return evt.wait() [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] result = hub.switch() [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] return self.greenlet.switch() [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2377.374596] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] self.f(*self.args, **self.kw) [ 2377.374884] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2377.374884] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] raise exceptions.translate_fault(task_info.error) [ 2377.374884] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2377.374884] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Faults: ['InvalidArgument'] [ 2377.374884] env[68492]: ERROR nova.compute.manager [instance: c472a34d-b388-46c9-a7e0-7106b0666478] [ 2377.375281] env[68492]: DEBUG nova.compute.utils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] VimFaultException {{(pid=68492) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2377.376592] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Build of instance c472a34d-b388-46c9-a7e0-7106b0666478 was re-scheduled: A specified parameter was not correct: fileType [ 2377.376592] env[68492]: Faults: ['InvalidArgument'] {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2377.377031] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Unplugging VIFs for instance {{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2377.377206] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68492) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2377.377383] env[68492]: DEBUG nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2377.377541] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2377.386827] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2377.437095] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2377.437346] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2377.438825] env[68492]: INFO nova.compute.claims [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2377.562671] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-057a2581-31d7-4e8f-998c-11b4dc284c17 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.571459] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f12edc-c506-43c2-ae90-c2082e2fcc1d {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.605831] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5db03594-24f0-42cf-9eea-af856b298a32 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.613324] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abcb8dac-f155-46bd-8ffd-83a29dedc8e4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.627203] env[68492]: DEBUG nova.compute.provider_tree [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 
tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2377.647745] env[68492]: DEBUG nova.scheduler.client.report [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2377.669569] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2377.670157] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2377.713727] env[68492]: DEBUG nova.network.neutron [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2377.729591] env[68492]: DEBUG nova.compute.utils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2377.731084] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2377.731354] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2377.734297] env[68492]: INFO nova.compute.manager [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Took 0.36 seconds to deallocate network for instance. 
[ 2377.747822] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2377.808798] env[68492]: DEBUG nova.policy [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '568ab24cbb7d4833bb8cdfd51db89db5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80fa34aee50b4509a18abca39075924a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 2377.836480] env[68492]: INFO nova.scheduler.client.report [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Deleted allocations for instance c472a34d-b388-46c9-a7e0-7106b0666478 [ 2377.843181] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2377.862714] env[68492]: DEBUG oslo_concurrency.lockutils [None req-c433a3d1-ffe7-4179-b8e2-567eebcfe39e tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 522.291s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2377.862982] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 326.085s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2377.863227] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2377.863438] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68492) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2377.863605] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2377.865849] env[68492]: INFO nova.compute.manager [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Terminating instance [ 2377.867852] env[68492]: DEBUG nova.compute.manager [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Start destroying the instance on the hypervisor. {{(pid=68492) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2377.868069] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Destroying instance {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2377.868939] env[68492]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-77dc920e-b4d1-4e79-8aa0-91fed17fdec3 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.880553] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-025e134b-a7f9-4ff4-888c-f985826332e1 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.893598] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2377.893825] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2377.893981] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d 
tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2377.894195] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2377.894500] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2377.894500] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2377.894677] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2377.894917] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2377.894986] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2377.895150] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2377.895324] env[68492]: DEBUG nova.virt.hardware [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2377.896531] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-527fe174-9809-4b46-9241-333e8ab6b26e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.904095] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-839fa027-2cc0-4a4c-8c6b-ae1f895b4e1f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2377.914399] env[68492]: WARNING nova.virt.vmwareapi.vmops [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 
tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c472a34d-b388-46c9-a7e0-7106b0666478 could not be found. [ 2377.914592] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Instance destroyed {{(pid=68492) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2377.914769] env[68492]: INFO nova.compute.manager [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Took 0.05 seconds to destroy the instance on the hypervisor. [ 2377.915014] env[68492]: DEBUG oslo.service.loopingcall [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2377.916478] env[68492]: DEBUG nova.compute.manager [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Deallocating network for instance {{(pid=68492) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2377.916478] env[68492]: DEBUG nova.network.neutron [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] deallocate_for_instance() {{(pid=68492) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 2377.949659] env[68492]: DEBUG nova.network.neutron [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Updating instance_info_cache with network_info: [] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2377.957598] env[68492]: INFO nova.compute.manager [-] [instance: c472a34d-b388-46c9-a7e0-7106b0666478] Took 0.04 seconds to deallocate network for instance. 
[ 2378.052679] env[68492]: DEBUG oslo_concurrency.lockutils [None req-e6ffe587-83a5-4932-adbb-9c32b42c130d tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "c472a34d-b388-46c9-a7e0-7106b0666478" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.190s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2378.112280] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Successfully created port: b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2379.090960] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Successfully updated port: b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2379.105340] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2379.105489] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2379.105640] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Building network info cache for instance {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 2379.158175] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Instance cache missing network info. 
{{(pid=68492) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 2379.375593] env[68492]: DEBUG nova.network.neutron [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Updating instance_info_cache with network_info: [{"id": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "address": "fa:16:3e:9b:ef:79", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e9bc2e-e7", "ovs_interfaceid": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2379.388603] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2379.388928] env[68492]: DEBUG nova.compute.manager [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Instance network_info: |[{"id": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "address": "fa:16:3e:9b:ef:79", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e9bc2e-e7", "ovs_interfaceid": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2379.389358] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None 
req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:ef:79', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35342bcb-8b06-472e-b3c0-43fd3d6c4b30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c', 'vif_model': 'vmxnet3'}] {{(pid=68492) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2379.402023] env[68492]: DEBUG oslo.service.loopingcall [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68492) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2379.403228] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Creating VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2379.403607] env[68492]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f69b18f6-9d25-4416-8ba1-873efa5472b6 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.428588] env[68492]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2379.428588] env[68492]: value = "task-3395593" [ 2379.428588] env[68492]: _type = "Task" [ 2379.428588] env[68492]: } to complete. {{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2379.438490] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395593, 'name': CreateVM_Task} progress is 0%. {{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2379.658551] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.658831] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.675671] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Starting instance... 
{{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2379.719561] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "09e79812-c221-443d-bf1b-ec89f44cb631" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.719801] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "09e79812-c221-443d-bf1b-ec89f44cb631" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.733663] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Starting instance... {{(pid=68492) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2379.749150] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.749150] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.749315] env[68492]: INFO nova.compute.claims [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2379.780926] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.791017] env[68492]: DEBUG nova.compute.manager [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Received event network-vif-plugged-b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2379.791017] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Acquiring lock "bac43f46-a210-4c37-8fea-7ca57b902144-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.791017] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Lock "bac43f46-a210-4c37-8fea-7ca57b902144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.791017] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Lock "bac43f46-a210-4c37-8fea-7ca57b902144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2379.791373] env[68492]: DEBUG nova.compute.manager [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] No waiting events found dispatching network-vif-plugged-b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2379.791373] env[68492]: WARNING nova.compute.manager [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Received unexpected event network-vif-plugged-b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c for instance with vm_state building and task_state spawning. [ 2379.791514] env[68492]: DEBUG nova.compute.manager [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Received event network-changed-b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2379.791953] env[68492]: DEBUG nova.compute.manager [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Refreshing instance network info cache due to event network-changed-b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c. 
{{(pid=68492) external_instance_event /opt/stack/nova/nova/compute/manager.py:11112}} [ 2379.792632] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Acquiring lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2379.793025] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Acquired lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2379.794178] env[68492]: DEBUG nova.network.neutron [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Refreshing network info cache for port b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c {{(pid=68492) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 2379.898276] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a641b010-174b-43e6-9cf6-0405a861728f {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.908520] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-957dad90-3076-4ec8-8f82-1143fc13c3ae {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.941568] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf5d681f-3ff0-4256-a98e-466a2ed96fc5 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.951459] env[68492]: DEBUG oslo_vmware.api [-] Task: {'id': task-3395593, 'name': CreateVM_Task, 'duration_secs': 0.302508} completed successfully. 
{{(pid=68492) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2379.953661] env[68492]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Created VM on the ESX host {{(pid=68492) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2379.954519] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2379.954818] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2379.955272] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2379.956586] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7816d52f-1336-45b6-9ccf-a35811c0c468 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.960538] env[68492]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2d2d79bf-4eca-44e4-b847-6e2d2fcd0cb0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.965499] env[68492]: DEBUG oslo_vmware.api [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Waiting for the task: (returnval){ [ 2379.965499] env[68492]: value = "session[52aa75e3-97e3-c62c-0f0b-5b59bc3dabee]5247d0bd-89d6-9cc1-d001-653e738939d8" [ 2379.965499] env[68492]: _type = "Task" [ 2379.965499] env[68492]: } to complete. 
{{(pid=68492) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2379.973696] env[68492]: DEBUG nova.compute.provider_tree [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2379.990318] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Releasing lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2379.990774] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Processing image 595bda25-3485-4d7e-9f66-50f61186cadc {{(pid=68492) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2379.991133] env[68492]: DEBUG oslo_concurrency.lockutils [None req-7208eaee-8084-424b-93f3-8228ff4a7c4d tempest-ServersTestJSON-1176539008 tempest-ServersTestJSON-1176539008-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/595bda25-3485-4d7e-9f66-50f61186cadc/595bda25-3485-4d7e-9f66-50f61186cadc.vmdk" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2380.004590] env[68492]: DEBUG nova.scheduler.client.report [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2380.045136] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2380.045136] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Start building networks asynchronously for instance. 
{{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2380.046320] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.266s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2380.047806] env[68492]: INFO nova.compute.claims [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2380.130374] env[68492]: DEBUG nova.compute.utils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2380.131788] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Allocating IP information in the background. {{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2380.131963] env[68492]: DEBUG nova.network.neutron [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2380.164872] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2380.225158] env[68492]: DEBUG oslo_service.periodic_task [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68492) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2380.239059] env[68492]: DEBUG nova.network.neutron [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Updated VIF entry in instance network info cache for port b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c. 
{{(pid=68492) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 2380.239438] env[68492]: DEBUG nova.network.neutron [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] [instance: bac43f46-a210-4c37-8fea-7ca57b902144] Updating instance_info_cache with network_info: [{"id": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "address": "fa:16:3e:9b:ef:79", "network": {"id": "776d3f34-1122-4482-904f-fb5a8883a13d", "bridge": "br-int", "label": "tempest-ServersTestJSON-709876682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "80fa34aee50b4509a18abca39075924a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35342bcb-8b06-472e-b3c0-43fd3d6c4b30", "external-id": "nsx-vlan-transportzone-524", "segmentation_id": 524, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e9bc2e-e7", "ovs_interfaceid": "b5e9bc2e-e79f-47e2-9969-8f6bf4c15b6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68492) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2380.243132] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Getting list of instances from cluster (obj){ [ 2380.243132] env[68492]: value = "domain-c8" [ 2380.243132] env[68492]: _type = "ClusterComputeResource" [ 2380.243132] env[68492]: } {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2380.244463] env[68492]: DEBUG nova.policy [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eca85f521b2f4a9c9ecf05120198f3de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4155239cd01a410fa600f06c709fe5c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 2380.249835] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e1b98c4-21ee-4199-b413-544fa233c8fc {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.253382] env[68492]: DEBUG oslo_concurrency.lockutils [req-6aecbacd-3f69-4168-8d66-4a8e96e1e120 req-58597220-b313-442d-a470-addbf9e79e0c service nova] Releasing lock "refresh_cache-bac43f46-a210-4c37-8fea-7ca57b902144" {{(pid=68492) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2380.263264] env[68492]: DEBUG nova.virt.vmwareapi.vmops [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Got total of 5 instances {{(pid=68492) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2380.263428] 
env[68492]: WARNING nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] While synchronizing instance power states, found 7 instances in the database and 5 instances on the hypervisor. [ 2380.263568] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid ffddeec8-4442-413c-a0a0-2cf2b110cf14 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.263756] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 75bbcae2-54ab-47d2-9bf8-b55b0881fb90 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.263915] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 696b560c-f4ed-4105-87e9-e5380a468fe1 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.264099] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 62a40c52-fae7-4025-b0af-1c2124e4d6f5 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.264270] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid bac43f46-a210-4c37-8fea-7ca57b902144 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.264610] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.264819] env[68492]: DEBUG nova.compute.manager [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Triggering sync for uuid 09e79812-c221-443d-bf1b-ec89f44cb631 {{(pid=68492) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10327}} [ 2380.266469] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "ffddeec8-4442-413c-a0a0-2cf2b110cf14" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.266713] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "75bbcae2-54ab-47d2-9bf8-b55b0881fb90" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.266951] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "696b560c-f4ed-4105-87e9-e5380a468fe1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.267172] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "62a40c52-fae7-4025-b0af-1c2124e4d6f5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.267371] env[68492]: DEBUG 
oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "bac43f46-a210-4c37-8fea-7ca57b902144" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.267566] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.267776] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6fbf7e5-4a69-4ac6-9b67-da1e3d1054b8 None None] Acquiring lock "09e79812-c221-443d-bf1b-ec89f44cb631" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2380.268563] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70d00af6-44d8-4e48-aa25-2c782449d3ef {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.271677] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2380.278528] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf77dacc-c216-4b1c-86db-b6c4f927d6a4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.310284] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d847e2bd-b419-4aec-9bd6-b04bbc0ca3b2 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.317117] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-954d5f91-915d-4f42-b76b-2d3c72bb10f4 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.323111] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=<?>,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-10T14:54:07Z,virtual_size=<?>,visibility=<?>), 
allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2380.323349] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2380.323506] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2380.323689] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2380.323834] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2380.323981] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2380.324218] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2380.324382] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2380.324610] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2380.324703] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2380.324905] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2380.326123] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-3eded855-01e9-4df2-aa59-5f79b39ef1ee {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.339688] env[68492]: DEBUG nova.compute.provider_tree [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed in ProviderTree for provider: dba0d66f-84ca-40a4-90ee-609cf684af11 {{(pid=68492) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2380.346014] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0d38ce-089a-4729-b77f-6641b6e7e215 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.349118] env[68492]: DEBUG nova.scheduler.client.report [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Inventory has not changed for provider dba0d66f-84ca-40a4-90ee-609cf684af11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 102, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68492) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2380.363420] env[68492]: DEBUG oslo_concurrency.lockutils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=68492) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2380.363905] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Start building networks asynchronously for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2380.394511] env[68492]: DEBUG nova.compute.utils [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Using /dev/sd instead of None {{(pid=68492) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2380.395753] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Allocating IP information in the background. 
{{(pid=68492) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2380.395927] env[68492]: DEBUG nova.network.neutron [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] allocate_for_instance() {{(pid=68492) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2380.406767] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Start building block device mappings for instance. {{(pid=68492) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2380.477400] env[68492]: DEBUG nova.compute.manager [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Start spawning the instance on the hypervisor. {{(pid=68492) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2380.496408] env[68492]: DEBUG nova.policy [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eca85f521b2f4a9c9ecf05120198f3de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4155239cd01a410fa600f06c709fe5c6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68492) authorize /opt/stack/nova/nova/policy.py:203}} [ 2380.500566] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-10T14:54:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-10T14:54:06Z,direct_url=,disk_format='vmdk',id=595bda25-3485-4d7e-9f66-50f61186cadc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c89109061376457ab5ab750f8f509d25',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-10T14:54:07Z,virtual_size=,visibility=), allow threads: False {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2380.500802] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor limits 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2380.501074] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image limits 0:0:0 
{{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2380.501284] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Flavor pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2380.501490] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Image pref 0:0:0 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2380.501652] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68492) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2380.501865] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2380.502043] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2380.502218] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Got 1 possible topologies {{(pid=68492) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2380.502381] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2380.502554] env[68492]: DEBUG nova.virt.hardware [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68492) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2380.503454] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804ae2b6-9a92-4625-b561-2d5fa1de1fb0 {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.514422] env[68492]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5c7e9ee-7e08-4ced-8bce-b9ac974d677e {{(pid=68492) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2380.554931] env[68492]: DEBUG nova.network.neutron [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 
tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 0d8db1a2-d0cc-4a7d-a2f6-dcf6bc8dc6a1] Successfully created port: d70e39fb-38ec-4c5e-a241-55b44419a3ce {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2380.848370] env[68492]: DEBUG nova.network.neutron [None req-f6ff62f6-2dd8-4bb7-8120-9ede595a8e72 tempest-MultipleCreateTestJSON-465684580 tempest-MultipleCreateTestJSON-465684580-project-member] [instance: 09e79812-c221-443d-bf1b-ec89f44cb631] Successfully created port: f2e44f18-66dc-43d9-a17e-94f63ba315c8 {{(pid=68492) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}